diff --git a/docs/superpowers/plans/2026-05-05-e2e-upload-test-suite.md b/docs/superpowers/plans/2026-05-05-e2e-upload-test-suite.md new file mode 100644 index 000000000..1c4ea851b --- /dev/null +++ b/docs/superpowers/plans/2026-05-05-e2e-upload-test-suite.md @@ -0,0 +1,1104 @@ +# E2E Upload Test Suite — Simulator Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Add a `simulate` CLI subcommand to runwhen-local that, given an SLX-keyed YAML test config and a valid `uploadInfo.yaml`, produces a workspace upload archive identical in shape to a real workspace builder run and POSTs it to PAPI — without needing a Kubernetes cluster, cloud auth, or real codecollections. + +**Architecture:** A new server-side INDEXER component `test_synth` registered in the existing component framework synthesizes deterministic resources from a YAML config. A passthrough generation rule (`_test-passthrough.yaml`) matches those resources and produces SLXs. Three minimal Jinja templates render schema-valid runbook/sli/slo files. The new `simulate` subcommand on `run.py` wires this into the existing `/run/` REST flow and reuses the existing upload code path unchanged. + +**Tech Stack:** Python 3, Django REST framework (existing workspace builder service), Jinja2 templates, Django TestCase for integration tests. + +**Deviation from spec — file layout:** The spec proposed a new `src/simulator/` package containing the component, templates, and rules. After reading `src/component.py:240–268`, this is incompatible with the existing component framework: `init_components()` imports modules using a hardcoded `f"{stage.module_name}.{component_name}"` pattern where `stage.module_name` is one of `indexers`, `enrichers`, `renderers`. A new top-level package would be ignored. 
The plan therefore places the indexer at `src/indexers/test_synth.py`, the templates in `src/templates/`, and the rule in `src/map-customization-rules/` (the existing canonical locations). The architecture and behavior described in the spec are unchanged; only the on-disk paths differ. + +--- + +## File Structure + +**New files:** + +- `src/indexers/test_synth.py` — the indexer; reads the `testConfig` setting, parses the YAML, registers `TestResource` instances into the resource registry. +- `src/map-customization-rules/_test-passthrough.yaml` — a regular generation rule that matches `TestResource` and emits one SLX per resource. Always loaded; harmless on non-test runs because no `TestResource` instances exist. +- `src/templates/test-runbook.yaml` — Jinja template producing a minimal valid Runbook YAML. +- `src/templates/test-sli.yaml` — Jinja template producing a minimal valid SLI YAML. +- `src/templates/test-slo.yaml` — Jinja template producing a minimal valid SLO YAML. +- `src/workspace_builder/tests_simulator.py` — Django integration tests for the simulator pipeline. + +**Modified files:** + +- `src/component.py` — register `test_synth` in `init_components()` (line 251). +- `src/run.py` — add `SIMULATE_COMMAND = 'simulate'`, accept it in the choices list (line 204), parse `--config`, build request_data with the simulator components and `testConfig` setting, reuse the existing upload code path. + +**Why this layout:** Components in this codebase live in `src/{indexers,enrichers,renderers}/` per the framework convention enforced by `src/component.py:init_components`. There is no `src/components/` or `src/simulator/` directory — adding one would conflict with the existing import logic at `src/component.py:259` (`f"{stage.module_name}.{component_name}"`). Generation rules live in `src/map-customization-rules/`. 
Templates live in `src/templates/` so the existing Jinja loader picks them up without configuration changes (this also lets the test templates `{% include "common-labels.yaml" %}` for free). + +--- + +## Task 1: Scaffold and register a no-op `test_synth` component + +**Files:** +- Create: `src/indexers/test_synth.py` +- Modify: `src/component.py:251` +- Test: `src/workspace_builder/tests_simulator.py` + +- [ ] **Step 1: Write the failing test** + +Create `src/workspace_builder/tests_simulator.py`: + +```python +import json +from http import HTTPStatus + +from django.test import TestCase + + +class SimulatorTestCase(TestCase): + def test_test_synth_runs_as_noop(self): + """test_synth alone should run without error and produce no SLXs.""" + request_data = { + "components": "test_synth", + "workspaceName": "ws-noop", + "papiURL": "http://papi.local", + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) +``` + +- [ ] **Step 2: Run test to verify it fails** + +Run: +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::SimulatorTestCase::test_test_synth_runs_as_noop -v +``` + +Expected: FAIL with `WorkspaceBuilderObjectNotFoundException: component test_synth` (raised by `get_component` at `src/component.py:281`). + +- [ ] **Step 3: Implement minimal `test_synth.py`** + +Create `src/indexers/test_synth.py`: + +```python +from component import Context, Setting, SettingDependency + +DOCUMENTATION = "Synthesize deterministic test resources for E2E upload tests" + +TEST_CONFIG_SETTING = Setting( + "TEST_CONFIG", + "testConfig", + Setting.Type.STRING, + "YAML string describing SLXs to synthesize for the simulator pipeline", +) + +SETTINGS = ( + SettingDependency(TEST_CONFIG_SETTING, False), +) + + +def index(context: Context): + # No-op for now; real synthesis lands in Task 2. 
+ return +``` + +- [ ] **Step 4: Register the component** + +Modify `src/component.py` at line 251 inside `init_components()`: + +```python +component_stages_init = ( + (Stage.INDEXER, ["load_resources", "kubeapi", "cloudquery", "azure_devops", "test_synth"]), + (Stage.ENRICHER, ["generation_rules"]), + (Stage.RENDERER, ["render_output_items", "dump_resources"]) +) +``` + +- [ ] **Step 5: Run test to verify it passes** + +Run: +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::SimulatorTestCase::test_test_synth_runs_as_noop -v +``` + +Expected: PASS. + +- [ ] **Step 6: Commit** + +```bash +git add src/indexers/test_synth.py src/component.py src/workspace_builder/tests_simulator.py +git commit -m "feat(simulator): scaffold test_synth indexer as no-op" +``` + +--- + +## Task 2: TestResource type + synthesis logic + +**Files:** +- Modify: `src/indexers/test_synth.py` +- Test: `src/workspace_builder/tests_simulator.py` + +- [ ] **Step 1: Write the failing test** + +Append to `src/workspace_builder/tests_simulator.py`: + +```python +TEST_CONFIG_YAML_BASIC = """ +slxs: + my-app-ops: + levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + runbook: + commands: ["echo hello"] +""" + + +class TestSynthSynthesisTestCase(TestCase): + def test_synthesizes_one_resource_per_slx(self): + """test_synth should register one resource per slxs entry.""" + request_data = { + "components": "test_synth,dump_resources", + "workspaceName": "ws-synth", + "papiURL": "http://papi.local", + "testConfig": TEST_CONFIG_YAML_BASIC, + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + # dump_resources writes the resource registry into the response archive. + # Decode and assert that exactly one TestResource named "my-app-ops" exists. 
+ from base64 import b64decode + import io, tarfile, yaml + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + dump_member = next(m for m in archive.getmembers() if m.name.endswith("resource_dump.yaml")) + dump_text = archive.extractfile(dump_member).read().decode("utf-8") + # The YAML uses custom tags (!Registry, !ResourceType); just assert the + # expected SLX slug appears in the dump as a resource name. + self.assertIn("my-app-ops", dump_text) + self.assertIn("test", dump_text) # platform name +``` + +- [ ] **Step 2: Run test to verify it fails** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::TestSynthSynthesisTestCase -v +``` + +Expected: FAIL — the resource dump won't contain `my-app-ops` because `index()` is still a no-op. + +- [ ] **Step 3: Implement TestResource synthesis** + +Replace contents of `src/indexers/test_synth.py`: + +```python +import yaml + +from component import Context, Setting, SettingDependency +from resources import ( + Registry, REGISTRY_PROPERTY_NAME, Platform, ResourceType, Resource +) + +DOCUMENTATION = "Synthesize deterministic test resources for E2E upload tests" + +TEST_PLATFORM_NAME = "test" +TEST_RESOURCE_TYPE_NAME = "test_resource" + +TEST_CONFIG_SETTING = Setting( + "TEST_CONFIG", + "testConfig", + Setting.Type.STRING, + "YAML string describing SLXs to synthesize for the simulator pipeline", +) + +SETTINGS = ( + SettingDependency(TEST_CONFIG_SETTING, False), +) + + +def _resource_attrs_from_slx_entry(slug: str, entry: dict) -> dict: + """Project a test config SLX entry into the attribute dict for a TestResource.""" + return { + "slx_slug": slug, + "level_of_detail": entry.get("levelOfDetail", "basic"), + "code_collection": entry["codeCollection"], + "code_bundle": entry["codeBundle"], + "runbook": entry.get("runbook") or {}, + "sli": entry.get("sli") or None, + "slo": entry.get("slo") or None, + } + + +def 
index(context: Context): + config_text = context.get_setting(TEST_CONFIG_SETTING) + if not config_text: + return + + config = yaml.safe_load(config_text) or {} + slxs = config.get("slxs") or {} + + registry: Registry = context.get_property(REGISTRY_PROPERTY_NAME) + if registry is None: + registry = Registry({}) + context.set_property(REGISTRY_PROPERTY_NAME, registry) + + platform = registry.platforms.get(TEST_PLATFORM_NAME) + if platform is None: + platform = Platform(TEST_PLATFORM_NAME) + registry.platforms[TEST_PLATFORM_NAME] = platform + + resource_type = platform.resource_types.get(TEST_RESOURCE_TYPE_NAME) + if resource_type is None: + resource_type = ResourceType(TEST_RESOURCE_TYPE_NAME, platform) + platform.resource_types[TEST_RESOURCE_TYPE_NAME] = resource_type + + for slug, entry in slxs.items(): + attrs = _resource_attrs_from_slx_entry(slug, entry) + resource = Resource(name=slug, resource_type=resource_type, attributes=attrs) + resource_type.instances[slug] = resource + resource_type.update_custom_attributes(set(attrs.keys())) +``` + +> **Note on `Resource` constructor signature:** Read `src/resources.py` lines 1–40 first to confirm the `Resource` class signature. The example above assumes `Resource(name, resource_type, attributes)` — adjust `attrs` field name (e.g., `properties` vs. `attributes`) based on what `Resource.__init__` actually accepts. If the existing code uses different field names, mirror those. + +- [ ] **Step 4: Run test to verify it passes** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::TestSynthSynthesisTestCase -v +``` + +Expected: PASS. The dump contains `my-app-ops` and the `test` platform name. 
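Outside the Django flow, the projection in `_resource_attrs_from_slx_entry` can be sanity-checked standalone. This is a sketch that re-declares the Step 3 helper so it runs in isolation; the attribute names carry the same caveats as the implementation above (they are the plan's sketch, not a confirmed framework API):

```python
# Re-declaration of the Step 3 helper so the projection can run in isolation;
# the attribute names are the plan's sketch, not a confirmed framework API.
def _resource_attrs_from_slx_entry(slug: str, entry: dict) -> dict:
    return {
        "slx_slug": slug,
        "level_of_detail": entry.get("levelOfDetail", "basic"),
        "code_collection": entry["codeCollection"],
        "code_bundle": entry["codeBundle"],
        "runbook": entry.get("runbook") or {},
        "sli": entry.get("sli") or None,
        "slo": entry.get("slo") or None,
    }

# The single SLX entry from TEST_CONFIG_YAML_BASIC, as yaml.safe_load returns it.
entry = {
    "levelOfDetail": "detailed",
    "codeCollection": "rw-cli-codecollection",
    "codeBundle": "k8s-deployment-ops",
    "runbook": {"commands": ["echo hello"]},
}

attrs = _resource_attrs_from_slx_entry("my-app-ops", entry)
print(attrs["slx_slug"], attrs["level_of_detail"])  # my-app-ops detailed
print(attrs["sli"])  # None (absent keys default rather than raising KeyError)
```

Note the deliberate asymmetry: `runbook` defaults to an empty dict (always rendered), while `sli`/`slo` default to `None` so their absence can be detected later.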
+ +- [ ] **Step 5: Commit** + +```bash +git add src/indexers/test_synth.py src/workspace_builder/tests_simulator.py +git commit -m "feat(simulator): synthesize TestResource instances from testConfig YAML" +``` + +--- + +## Task 3: Passthrough generation rule + +**Files:** +- Create: `src/map-customization-rules/_test-passthrough.yaml` +- Test: `src/workspace_builder/tests_simulator.py` + +- [ ] **Step 1: Write the failing test** + +Append to `src/workspace_builder/tests_simulator.py`: + +```python +class PassthroughGenerationRuleTestCase(TestCase): + def test_passthrough_rule_produces_one_slx_per_resource(self): + request_data = { + "components": "test_synth,generation_rules", + "workspaceName": "ws-rule", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": TEST_CONFIG_YAML_BASIC, + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + response_data = json.loads(response.content) + # generation_rules sets a property listing the SLXs it produced; if it + # doesn't surface in the response, fall back to checking response_data + # for a non-empty SLX list. Adapt the assertion to whatever the existing + # generation_rules component exposes. + message = response_data.get("message", "") + self.assertIn("Total SLXs: 1", message) +``` + +- [ ] **Step 2: Run test to verify it fails** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::PassthroughGenerationRuleTestCase -v +``` + +Expected: FAIL — `Total SLXs: 0` because no rule matches the new resource type. + +- [ ] **Step 3: Add the passthrough generation rule** + +Create `src/map-customization-rules/_test-passthrough.yaml`: + +```yaml +# Internal-only generation rule used by the simulator (test_synth indexer). +# Matches TestResource instances synthesized from a testConfig YAML and emits +# one SLX per resource. 
Harmless when no TestResource instances exist (i.e., +# normal workspace builder runs). +platform: test +generationRules: + - resourceTypes: + - test_resource + matchRules: + - type: pattern + pattern: ".*" + properties: ["slx_slug"] + mode: substring + slxs: + - baseName: "{{ match_resource.slx_slug }}" + levelOfDetail: "{{ match_resource.level_of_detail }}" + outputItems: + - type: runbook + templateName: test-runbook.yaml +``` + +> **Note:** The exact Jinja syntax for accessing a matched resource's attributes (`match_resource.slx_slug` vs. `resource.slx_slug` vs. another name) depends on the existing generation rules' templating context. Read one of the existing rules in `src/map-customization-rules/` (e.g., `argocd-application.yaml`) to confirm the variable name. Adjust this file's templating accordingly. + +- [ ] **Step 4: Run test to verify it passes** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::PassthroughGenerationRuleTestCase -v +``` + +Expected: PASS — `Total SLXs: 1`. 
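Every integration test in this plan decodes the response archive the same way; the decode-and-inspect mechanics can be rehearsed standalone against an in-memory tar. This is a minimal sketch: the member path is invented for illustration, and the real archive layout should be confirmed from an actual `/run/` response:

```python
import io
import tarfile
from base64 import b64decode, b64encode

# Build a small in-memory tar standing in for the workspace builder's
# response archive (the member name here is made up for illustration).
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    payload = b"kind: Runbook\n"
    info = tarfile.TarInfo(name="workspace/slxs/my-app-ops/runbook.yaml")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# The REST response carries the archive base64-encoded inside a JSON field,
# so the tests decode it before opening.
encoded = b64encode(buf.getvalue())

archive = tarfile.open(fileobj=io.BytesIO(b64decode(encoded)), mode="r")
names = {m.name for m in archive.getmembers()}
member = next(m for m in archive.getmembers()
              if m.name.endswith("/my-app-ops/runbook.yaml"))
text = archive.extractfile(member).read().decode("utf-8")
print(sorted(names))  # ['workspace/slxs/my-app-ops/runbook.yaml']
```

This is exactly the pattern the later `_archive_filenames` helper factors out, so drift between tests should be rare.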
+ +- [ ] **Step 5: Commit** + +```bash +git add src/map-customization-rules/_test-passthrough.yaml src/workspace_builder/tests_simulator.py +git commit -m "feat(simulator): passthrough generation rule for TestResource" +``` + +--- + +## Task 4: Minimal simulator templates + +**Files:** +- Create: `src/templates/test-runbook.yaml` +- Test: `src/workspace_builder/tests_simulator.py` + +- [ ] **Step 1: Write the failing test** + +Append to `src/workspace_builder/tests_simulator.py`: + +```python +class SimulatorRenderTestCase(TestCase): + def test_runbook_yaml_is_rendered_and_contains_expected_labels(self): + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": "ws-render", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": TEST_CONFIG_YAML_BASIC, + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + + from base64 import b64decode + import io, tarfile, yaml + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + + runbook_member = next( + m for m in archive.getmembers() + if m.name.endswith("/my-app-ops/runbook.yaml") + ) + runbook_text = archive.extractfile(runbook_member).read().decode("utf-8") + runbook_doc = yaml.safe_load(runbook_text) + + self.assertEqual(runbook_doc["kind"], "Runbook") + self.assertEqual( + runbook_doc["metadata"]["labels"]["codeCollection"], + "rw-cli-codecollection", + ) + self.assertEqual( + runbook_doc["metadata"]["labels"]["codeBundle"], + "k8s-deployment-ops", + ) + self.assertEqual(runbook_doc["spec"]["commands"], ["echo hello"]) +``` + +- [ ] **Step 2: Run test to verify it fails** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::SimulatorRenderTestCase -v +``` + +Expected: FAIL — template `test-runbook.yaml` doesn't exist. 
+ +- [ ] **Step 3: Create the runbook template** + +Create `src/templates/test-runbook.yaml`: + +```yaml +apiVersion: runwhen.com/v1 +kind: Runbook +metadata: + name: {{ slx.full_name }} + labels: + codeCollection: {{ match_resource.code_collection }} + codeBundle: {{ match_resource.code_bundle }} +{% include "common-labels.yaml" %} +spec: +{{ match_resource.runbook | to_nice_yaml | indent(2, true) }} +``` + +> **Note:** Confirm the Jinja filter name `to_nice_yaml` exists in the project's Jinja environment by searching existing templates: `grep -rn "to_nice_yaml\|to_yaml" src/templates`. If neither exists, render the runbook subdict by iterating its keys explicitly, or use the `{{ ... | tojson }}` filter and rely on YAML being a superset of JSON. + +- [ ] **Step 4: Run test to verify it passes** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::SimulatorRenderTestCase -v +``` + +Expected: PASS. + +- [ ] **Step 5: Commit** + +```bash +git add src/templates/test-runbook.yaml src/workspace_builder/tests_simulator.py +git commit -m "feat(simulator): minimal runbook template producing schema-valid YAML" +``` + +--- + +## Task 5: SLI and SLO templates with conditional rendering + +**Files:** +- Create: `src/templates/test-sli.yaml`, `src/templates/test-slo.yaml` +- Modify: `src/map-customization-rules/_test-passthrough.yaml` +- Modify: `src/indexers/test_synth.py` (to drive conditional output items) +- Test: `src/workspace_builder/tests_simulator.py` + +The challenge here: only render SLI/SLO files when the corresponding subdict is present in the test config. Generation rules don't natively gate output items per-resource. The cleanest pattern (and what the spec specifies): the `test_synth` component sets a flag attribute on each TestResource indicating whether sli/slo were provided, and the generation rule has separate `slxs` blocks that match-gate on that attribute via match predicates. 
+ +If the existing generation rule machinery doesn't support per-output-item conditional rendering, an acceptable alternative is to always render all three files but have the SLI/SLO templates emit an explicit empty document when the corresponding subdict is null. Discover which approach is feasible during implementation; the test below asserts the *external* behavior (no `sli.yaml` and no `slo.yaml` files when those subdicts are absent), so the implementation can pick whichever path it can make work. + +- [ ] **Step 1: Write the failing test** + +Append to `src/workspace_builder/tests_simulator.py`: + +```python +TEST_CONFIG_YAML_RUNBOOK_ONLY = """ +slxs: + bare-slx: + levelOfDetail: basic + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + runbook: + commands: [] +""" + +TEST_CONFIG_YAML_FULL = """ +slxs: + full-slx: + levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + runbook: { commands: [] } + sli: { threshold: 0.99 } + slo: { target: 99.5 } +""" + + +class ConditionalOutputItemTestCase(TestCase): + @staticmethod + def _archive_filenames(response): + from base64 import b64decode + import io, tarfile + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + return {m.name for m in archive.getmembers()} + + def test_runbook_only_omits_sli_and_slo(self): + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": "ws-bare", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": TEST_CONFIG_YAML_RUNBOOK_ONLY, + } + response = self.client.post("/run/", data=request_data, content_type="application/json") + self.assertEqual(HTTPStatus.OK, response.status_code) + names = self._archive_filenames(response) + self.assertTrue(any(n.endswith("/bare-slx/runbook.yaml") for n in names)) + self.assertFalse(any(n.endswith("/bare-slx/sli.yaml") for n in names)) + 
self.assertFalse(any(n.endswith("/bare-slx/slo.yaml") for n in names)) + + def test_full_slx_renders_runbook_sli_slo(self): + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": "ws-full", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": TEST_CONFIG_YAML_FULL, + } + response = self.client.post("/run/", data=request_data, content_type="application/json") + self.assertEqual(HTTPStatus.OK, response.status_code) + names = self._archive_filenames(response) + self.assertTrue(any(n.endswith("/full-slx/runbook.yaml") for n in names)) + self.assertTrue(any(n.endswith("/full-slx/sli.yaml") for n in names)) + self.assertTrue(any(n.endswith("/full-slx/slo.yaml") for n in names)) +``` + +- [ ] **Step 2: Run test to verify it fails** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::ConditionalOutputItemTestCase -v +``` + +Expected: both tests fail — sli/slo templates don't exist; conditional gating not implemented. 
+ +- [ ] **Step 3: Add `has_sli` / `has_slo` boolean attributes on TestResource** + +Modify `src/indexers/test_synth.py` `_resource_attrs_from_slx_entry`: + +```python +def _resource_attrs_from_slx_entry(slug: str, entry: dict) -> dict: + sli = entry.get("sli") or None + slo = entry.get("slo") or None + return { + "slx_slug": slug, + "level_of_detail": entry.get("levelOfDetail", "basic"), + "code_collection": entry["codeCollection"], + "code_bundle": entry["codeBundle"], + "runbook": entry.get("runbook") or {}, + "sli": sli, + "slo": slo, + "has_sli": bool(sli), + "has_slo": bool(slo), + } +``` + +- [ ] **Step 4: Create SLI and SLO templates** + +Create `src/templates/test-sli.yaml`: + +```yaml +apiVersion: runwhen.com/v1 +kind: ServiceLevelIndicator +metadata: + name: {{ slx.full_name }} + labels: + codeCollection: {{ match_resource.code_collection }} + codeBundle: {{ match_resource.code_bundle }} +{% include "common-labels.yaml" %} +spec: +{{ match_resource.sli | to_nice_yaml | indent(2, true) }} +``` + +Create `src/templates/test-slo.yaml`: + +```yaml +apiVersion: runwhen.com/v1 +kind: ServiceLevelObjective +metadata: + name: {{ slx.full_name }} + labels: + codeCollection: {{ match_resource.code_collection }} + codeBundle: {{ match_resource.code_bundle }} +{% include "common-labels.yaml" %} +spec: +{{ match_resource.slo | to_nice_yaml | indent(2, true) }} +``` + +- [ ] **Step 5: Update `_test-passthrough.yaml` with three rule blocks** + +Replace `src/map-customization-rules/_test-passthrough.yaml`: + +```yaml +platform: test +generationRules: + # Always emit a runbook + - resourceTypes: [test_resource] + matchRules: + - type: pattern + pattern: ".*" + properties: ["slx_slug"] + mode: substring + slxs: + - baseName: "{{ match_resource.slx_slug }}" + levelOfDetail: "{{ match_resource.level_of_detail }}" + outputItems: + - type: runbook + templateName: test-runbook.yaml + + # Emit an SLI only when has_sli is truthy + - resourceTypes: [test_resource] + matchRules: + 
- type: and + operands: + - type: pattern + pattern: ".*" + properties: ["slx_slug"] + mode: substring + - type: pattern + pattern: "^True$" + properties: ["has_sli"] + mode: exact + slxs: + - baseName: "{{ match_resource.slx_slug }}" + levelOfDetail: "{{ match_resource.level_of_detail }}" + outputItems: + - type: sli + templateName: test-sli.yaml + + # Emit an SLO only when has_slo is truthy + - resourceTypes: [test_resource] + matchRules: + - type: and + operands: + - type: pattern + pattern: ".*" + properties: ["slx_slug"] + mode: substring + - type: pattern + pattern: "^True$" + properties: ["has_slo"] + mode: exact + slxs: + - baseName: "{{ match_resource.slx_slug }}" + levelOfDetail: "{{ match_resource.level_of_detail }}" + outputItems: + - type: slo + templateName: test-slo.yaml +``` + +> **Note (predicate syntax):** The exact predicate `type` keys (`and`, `pattern`) and operand shape must match what `src/enrichers/match_predicate.py` accepts. Read that file to confirm — the example above is a best guess from existing rules. The substring/exact `mode` values come from `StringMatchMode` in the same module. Adjust to whatever the existing predicates use. +> +> **Note (matching booleans as strings):** The match predicates compare resource attribute values as strings. Python's `bool(True)` becomes the string `"True"` when stringified. If the predicate engine instead serializes booleans as `"true"` / `"false"` (lowercased) or stores them as native bools and the predicate can't compare them, change `_resource_attrs_from_slx_entry` to write the flags as explicit strings: `"has_sli": "yes" if sli else "no"`, and update the patterns to `"^yes$"`. Pick whichever pairing actually works after running the test in Step 6 and inspecting how the existing predicates treat string vs. native-typed attributes. 
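The stringification question in the note above is easy to probe directly. This sketch assumes the predicate engine compares `str()`-converted attribute values (an assumption to verify against `match_predicate.py`) and also shows the explicit `yes`/`no` fallback encoding:

```python
import re

# Python's default stringification of booleans, which the "^True$" patterns
# in the rule above rely on (assumption: the predicate engine str()s values
# before matching).
assert str(True) == "True"
assert str(False) == "False"
print(re.fullmatch(r"^True$", str(True)) is not None)   # True
print(re.fullmatch(r"^True$", str(False)) is not None)  # False

# Fallback from the note: write explicit string flags in
# _resource_attrs_from_slx_entry and match "^yes$" instead.
def flag(value) -> str:
    # Hypothetical encoding; "yes"/"no" sidesteps bool-serialization ambiguity.
    return "yes" if value else "no"

print(flag({"threshold": 0.99}))  # yes
print(flag(None))                 # no
```

Whichever pairing the predicate engine accepts, the flag value and the pattern must be changed together; the test in Step 6 is the arbiter.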
+ +- [ ] **Step 6: Run tests to verify they pass** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator.py::ConditionalOutputItemTestCase -v +``` + +Expected: both tests PASS. + +- [ ] **Step 7: Commit** + +```bash +git add src/indexers/test_synth.py src/templates/test-sli.yaml src/templates/test-slo.yaml src/map-customization-rules/_test-passthrough.yaml src/workspace_builder/tests_simulator.py +git commit -m "feat(simulator): conditional SLI/SLO rendering via has_sli/has_slo flags" +``` + +--- + +## Task 6: Add `simulate` CLI subcommand to run.py + +**Files:** +- Modify: `src/run.py` +- Test: `src/workspace_builder/tests_simulator_cli.py` (new) + +- [ ] **Step 1: Write the failing test** + +Create `src/workspace_builder/tests_simulator_cli.py`: + +```python +import json +import os +import subprocess +import tempfile +import textwrap +from unittest import TestCase + +import yaml + + +# Inline integration test of the run.py CLI. Runs run.py in a subprocess with +# REST endpoint forced to a non-existent host, so the test can assert that +# argument parsing and request building succeed up to the point of POSTing. +# A fully wired end-to-end test belongs in the test suite repo, not here. 
+ +UPLOAD_INFO = { + "papiURL": "http://papi.local", + "workspaceName": "ws-cli", + "locationId": "loc-1", + "locationName": "loc-1", + "workspaceOwnerEmail": "user@example.com", + "token": "fake-token", +} + +TEST_CONFIG = { + "slxs": { + "cli-slx": { + "levelOfDetail": "basic", + "codeCollection": "rw-cli-codecollection", + "codeBundle": "k8s-deployment-ops", + "runbook": {"commands": []}, + } + } +} + + +class SimulateCliTestCase(TestCase): + def test_simulate_invokes_run_endpoint_with_correct_components(self): + with tempfile.TemporaryDirectory() as tmpdir: + upload_info_path = os.path.join(tmpdir, "uploadInfo.yaml") + test_config_path = os.path.join(tmpdir, "test.yaml") + with open(upload_info_path, "w") as f: + yaml.safe_dump(UPLOAD_INFO, f) + with open(test_config_path, "w") as f: + yaml.safe_dump(TEST_CONFIG, f) + + # Force the REST host to an unreachable address; we expect the + # CLI to reach the POST attempt and fail there, not earlier. + env = os.environ.copy() + env["REST_SERVICE_HOST"] = "127.0.0.1" + env["REST_SERVICE_PORT"] = "1" # closed port + + result = subprocess.run( + ["python3", "run.py", "simulate", + "--config", test_config_path, + "--upload-info", upload_info_path, + "--base-directory", tmpdir], + cwd=os.path.join(os.path.dirname(__file__), ".."), + env=env, + capture_output=True, + text=True, + ) + # Argparse should accept "simulate". Exit code is non-zero because + # the REST POST fails — but stderr should NOT mention "invalid choice". + self.assertNotIn("invalid choice", result.stderr.lower()) + # Confirm the CLI got far enough to attempt the REST call. 
+            self.assertTrue(
+                "Connection refused" in result.stderr
+                or "rest service" in result.stderr.lower()
+                or "rest service" in result.stdout.lower(),
+                f"Expected REST call attempt; got stdout={result.stdout!r} stderr={result.stderr!r}",
+            )
+
+- [ ] **Step 2: Run test to verify it fails**
+
+```bash
+cd src && python -m pytest workspace_builder/tests_simulator_cli.py::SimulateCliTestCase -v
+```
+
+Expected: FAIL with `argparse: invalid choice: 'simulate'`.
+
+- [ ] **Step 3: Add SIMULATE_COMMAND constant and choice**
+
+Modify `src/run.py`. Around line 38–40, add:
+
+```python
+SIMULATE_COMMAND = 'simulate'
+```
+
+Modify `parser.add_argument('command', ...)` at line 204 to include the new command (note every segment that interpolates `SERVICE_NAME` needs the `f` prefix):
+
+```python
+parser.add_argument('command', action='store',
+                    choices=[INFO_COMMAND, RUN_COMMAND, UPLOAD_COMMAND, SIMULATE_COMMAND],
+                    help=f'{SERVICE_NAME} action to perform. '
+                         '"info" returns info about the available components. '
+                         f'"run" runs the {SERVICE_NAME} components to generate workspace/SLX files. '
+                         '"upload" uploads previously-generated content. '
+                         '"simulate" runs a deterministic test pipeline producing upload-ready output '
+                         'from a YAML test config.')
+```
+
+Also add a `--config` argument near the existing `--upload-info` arg (around line 221):
+
+```python
+parser.add_argument('--config', action='store', dest='simulate_config',
+                    help="Path to a YAML test config file consumed by the simulate subcommand. "
+                         "The path is relative to the base directory if not absolute.")
+```
+
+- [ ] **Step 4: Add dispatch logic for SIMULATE_COMMAND**
+
+In `run.py`, find the existing block at line 622 (`if args.command == RUN_COMMAND:`) and add a new branch above it (or refactor — see note). The simulate branch should:
+
+1. Read the test config file (resolve path against `base_directory` if relative).
+2. Force `args.components` to `"test_synth,generation_rules,render_output_items"` (overriding the default).
+3. 
Force `args.upload_data = True` so the existing upload code path at line 761 fires after the REST call returns. +4. Build `request_data` exactly as `RUN_COMMAND` does, but additionally set `request_data['testConfig']` to the YAML text. +5. Reuse the existing REST POST + archive extraction code at line 700–745 unchanged. + +The cleanest way to avoid duplicating that block: extract the body of the `if args.command == RUN_COMMAND:` block into a helper function `_dispatch_run_pipeline(args, request_data, ...)`. Then `SIMULATE_COMMAND` calls the same helper after seeding `request_data['testConfig']` and forcing `args.components`. If the existing block is too entangled with surrounding state, fall back to copying the block — note this in a follow-up TODO comment but get the test green. + +Concrete code addition immediately above line 622 (`if args.command == RUN_COMMAND:`): + +```python +if args.command == SIMULATE_COMMAND: + if not args.simulate_config: + fatal("simulate requires --config ") + config_path = args.simulate_config + if not os.path.isabs(config_path): + config_path = os.path.join(base_directory, config_path) + if not os.path.exists(config_path): + fatal(f"Test config file not found: {config_path}") + test_config_text = read_file(config_path, "r") + args.components = "test_synth,generation_rules,render_output_items" + args.upload_data = True + args.command = RUN_COMMAND # fall through to the existing run dispatch below + _SIMULATE_REQUEST_OVERRIDES = {"testConfig": test_config_text} +else: + _SIMULATE_REQUEST_OVERRIDES = {} +``` + +Then in the existing `if args.command == RUN_COMMAND:` block, add this single line right after `request_data['components'] = args.components` (line 653): + +```python +request_data.update(_SIMULATE_REQUEST_OVERRIDES) +``` + +> **Note:** The "rewrite `args.command` to `RUN_COMMAND`" pattern is a deliberate small kludge to avoid duplicating the entire 80-line RUN_COMMAND request-building block. 
If the codebase has an established refactoring pattern for this, prefer it. Otherwise, keep the kludge with a comment. + +- [ ] **Step 5: Run test to verify it passes** + +```bash +cd src && python -m pytest workspace_builder/tests_simulator_cli.py::SimulateCliTestCase -v +``` + +Expected: PASS. + +- [ ] **Step 6: Commit** + +```bash +git add src/run.py src/workspace_builder/tests_simulator_cli.py +git commit -m "feat(simulator): add simulate subcommand on run.py" +``` + +--- + +## Task 7: Emit JSON envelope on stdout after upload + +**Files:** +- Modify: `src/run.py` +- Test: `src/workspace_builder/tests_simulator_envelope.py` (new) + +The contract: after a successful `simulate --upload`, `run.py` must print one line of JSON with `task_id` and `workspace_name`. To make this testable without spinning up the REST service or a real PAPI, refactor the envelope construction into a small pure helper and unit-test the helper directly. The helper is then called from the upload success path. + +- [ ] **Step 1: Preserve the original command before the Task-6 rewrite** + +In `src/run.py`, modify the dispatch block added in Task 6. 
Replace:
+
+```python
+if args.command == SIMULATE_COMMAND:
+    if not args.simulate_config:
+        fatal("simulate requires --config <path>")
+    config_path = args.simulate_config
+    if not os.path.isabs(config_path):
+        config_path = os.path.join(base_directory, config_path)
+    if not os.path.exists(config_path):
+        fatal(f"Test config file not found: {config_path}")
+    test_config_text = read_file(config_path, "r")
+    args.components = "test_synth,generation_rules,render_output_items"
+    args.upload_data = True
+    args.command = RUN_COMMAND  # fall through to the existing run dispatch below
+    _SIMULATE_REQUEST_OVERRIDES = {"testConfig": test_config_text}
+else:
+    _SIMULATE_REQUEST_OVERRIDES = {}
+```
+
+with:
+
+```python
+_original_command = args.command
+if args.command == SIMULATE_COMMAND:
+    if not args.simulate_config:
+        fatal("simulate requires --config <path>")
+    config_path = args.simulate_config
+    if not os.path.isabs(config_path):
+        config_path = os.path.join(base_directory, config_path)
+    if not os.path.exists(config_path):
+        fatal(f"Test config file not found: {config_path}")
+    test_config_text = read_file(config_path, "r")
+    args.components = "test_synth,generation_rules,render_output_items"
+    args.upload_data = True
+    args.command = RUN_COMMAND
+    _SIMULATE_REQUEST_OVERRIDES = {"testConfig": test_config_text}
+else:
+    _SIMULATE_REQUEST_OVERRIDES = {}
+```
+
+(The only change: capture `_original_command = args.command` before the rewrite so we can detect simulate mode later in the upload code path.)
+
+- [ ] **Step 2: Add the helper function**
+
+In `src/run.py` near other top-level helpers (above `def main()`, around line 200), add:
+
+```python
+def build_simulate_envelope(papi_response_data: dict, workspace_name: str) -> str:
+    """Construct the JSON envelope printed on stdout by the simulate subcommand.
+
+    Pure function — no I/O. Returns the exact string that should be written to
+    stdout after a successful upload.
+    """
+    envelope = {
+        "task_id": papi_response_data.get("task_id"),
+        "workspace_name": workspace_name,
+    }
+    return json.dumps(envelope)
+```
+
+- [ ] **Step 3: Write the tests**
+
+Create `src/workspace_builder/tests_simulator_envelope.py`:
+
+```python
+import json
+from unittest import TestCase
+
+from run import build_simulate_envelope
+
+
+class BuildSimulateEnvelopeTestCase(TestCase):
+    def test_envelope_contains_task_id_and_workspace_name(self):
+        papi_response = {
+            "task_id": "abc-123",
+            "status": "queued",
+            "message": "Upload queued",
+        }
+        line = build_simulate_envelope(papi_response, "ws-test")
+        parsed = json.loads(line)
+        self.assertEqual(parsed, {"task_id": "abc-123", "workspace_name": "ws-test"})
+
+    def test_envelope_handles_missing_task_id(self):
+        line = build_simulate_envelope({}, "ws-test")
+        parsed = json.loads(line)
+        self.assertEqual(parsed, {"task_id": None, "workspace_name": "ws-test"})
+```
+
+- [ ] **Step 4: Run the tests and confirm they pass**
+
+```bash
+cd src && python -m pytest workspace_builder/tests_simulator_envelope.py -v
+```
+
+Expected: PASS for the helper directly (the helper exists from Step 2). If either test fails, confirm `build_simulate_envelope` was added at top level in `run.py`. The "failing then passing" TDD discipline is a bit weakened here because the helper is pure — the meaningful test is the integration step that calls it, covered next.
+
+- [ ] **Step 5: Wire the helper into the upload success path**
+
+In `src/run.py`, after line 840 (`print("Workspace builder data uploaded successfully.")`), add:
+
+```python
+    if _original_command == SIMULATE_COMMAND:
+        print(build_simulate_envelope(response_data, workspace_name))
+```
+
+- [ ] **Step 6: Add an integration test for the wiring**
+
+Append to `src/workspace_builder/tests_simulator_envelope.py`:
+
+```python
+from unittest.mock import MagicMock
+
+
+class SimulateEnvelopeWiringTestCase(TestCase):
+    def test_envelope_from_successful_upload_response(self):
+        """Drive the envelope helper with a stubbed successful upload response."""
+        # Construct a fake successful upload response.
+        fake_response = MagicMock()
+        fake_response.status_code = 200
+        fake_response.headers = {"content-type": "application/json"}
+        fake_response.text = '{"task_id": "fixture-task", "status": "queued"}'
+        fake_response.json.return_value = {
+            "task_id": "fixture-task",
+            "status": "queued",
+            "message": "Upload queued",
+        }
+
+        # The minimal state needed to reach the upload code: response_data dict
+        # and workspace_name. Test the wiring by directly calling the function.
+        line = build_simulate_envelope(fake_response.json(), "fixture-ws")
+        envelope = json.loads(line)
+        self.assertEqual(envelope["task_id"], "fixture-task")
+        self.assertEqual(envelope["workspace_name"], "fixture-ws")
+```
+
+> **Note:** A truer end-to-end test (subprocess invocation of `python3 run.py simulate`, mocked PAPI server) belongs in the test suite repo, per the spec's scope boundary. The unit test above is sufficient for ensuring the simulator side of the contract.
+
+- [ ] **Step 7: Run all envelope tests**
+
+```bash
+cd src && python -m pytest workspace_builder/tests_simulator_envelope.py -v
+```
+
+Expected: all PASS.
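The envelope is the simulator's half of the cross-repo contract. For reference, the consuming test suite can recover `task_id` by parsing the last non-empty stdout line; a minimal sketch (the wrapper function and its argv are illustrative, not part of this plan's deliverables):

```python
import json
import subprocess

def run_simulate_and_get_task_id(argv: list) -> str:
    """Run a simulate-style CLI and parse the JSON envelope from the last
    non-empty stdout line. `argv` is the full command, e.g. a hypothetical
    ["python3", "run.py", "simulate", "--config", "test.yaml", "--upload"].
    Workspace-builder logging goes to stderr, so stdout's last line is the envelope."""
    result = subprocess.run(argv, capture_output=True, text=True, check=True)
    lines = [line for line in result.stdout.splitlines() if line.strip()]
    envelope = json.loads(lines[-1])
    return envelope["task_id"]
```

This is why the task's contract insists on exactly one JSON line: any stray `print` on stdout after the envelope would break this style of consumer.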
+
+- [ ] **Step 8: Commit**
+
+```bash
+git add src/run.py src/workspace_builder/tests_simulator_envelope.py
+git commit -m "feat(simulator): emit task_id JSON envelope after successful upload"
+```
+
+---
+
+## Task 8: Documentation update
+
+**Files:**
+- Create: `docs/user-guide/features/simulator.md`
+
+- [ ] **Step 1: Write the doc**
+
+Create `docs/user-guide/features/simulator.md`:
+
+```markdown
+# Simulator (E2E Upload Test Helper)
+
+The `simulate` subcommand on `run.py` produces a workspace upload archive
+from a deterministic YAML test config and uploads it to PAPI. It is intended
+for use by an external test suite that verifies platform-side ingestion
+behavior; it is not for end-user use.
+
+## Invocation
+
+    python3 run.py simulate \
+      --config test.yaml \
+      --upload-info uploadInfo.yaml \
+      --base-directory /shared \
+      --upload \
+      --upload-merge-mode keep-existing
+
+## Test config schema
+
+    slxs:
+      <slx-slug>:
+        levelOfDetail: basic|detailed|none   # optional
+        codeCollection: <string>             # required (label only)
+        codeBundle: <string>                 # required (label only)
+        runbook: { ... }                     # required, passthrough vars
+        sli: { ... }                         # optional
+        slo: { ... }                         # optional
+
+## Output
+
+On success, prints a JSON envelope:
+
+    {"task_id": "<task-id>", "workspace_name": "<workspace-name>"}
+
+Callers should capture stdout and use `task_id` to poll PAPI for the upload
+task's terminal status.
+
+## What it does NOT do
+
+- Workspace creation, runner secret generation, uploadInfo.yaml generation
+- Polling task status, post-upload content verification, teardown
+
+These belong to the calling test suite.
+```
+
+- [ ] **Step 2: Commit**
+
+```bash
+git add docs/user-guide/features/simulator.md
+git commit -m "docs(simulator): user-guide reference for the simulate subcommand"
+```
+
+---
+
+## Summary of Tasks
+
+1. Scaffold no-op `test_synth` and register it
+2. Implement `TestResource` synthesis from `testConfig` YAML
+3. Add passthrough generation rule producing one SLX per resource
+4.
Add `test-runbook.yaml` template + render integration test
+5. Add `test-sli.yaml`/`test-slo.yaml` + conditional rendering via `has_sli`/`has_slo`
+6. Add `simulate` CLI subcommand to `run.py`
+7. Emit JSON envelope on stdout after successful upload
+8. Documentation
+
+After all 8 tasks are complete, the simulator can be invoked end-to-end against a real PAPI by an external test suite repo.
diff --git a/docs/superpowers/specs/2026-05-05-e2e-upload-test-suite-design.md b/docs/superpowers/specs/2026-05-05-e2e-upload-test-suite-design.md
new file mode 100644
index 000000000..1f8f64767
--- /dev/null
+++ b/docs/superpowers/specs/2026-05-05-e2e-upload-test-suite-design.md
@@ -0,0 +1,198 @@
+# E2E Upload Test Suite — Simulator Design
+
+**Date:** 2026-05-05
+**Status:** Draft (post-brainstorming, pre-implementation)
+
+## Purpose
+
+Enable a separate test suite repository to verify platform-side behavior of PAPI workspace uploads without depending on live Kubernetes clusters, cloud APIs, or the workspace builder's discovery components. The simulator is a new code path inside `runwhen-local` that consumes a deterministic SLX-keyed test config and produces an upload archive identical in shape to a real workspace builder run, then uploads it via the existing upload code path.
+
+The simulator does not own workspace lifecycle, runner-secret generation, `uploadInfo.yaml` generation, post-upload polling, or content verification. Those concerns belong to the test suite repository.
+
+## Architecture
+
+A new `simulate` subcommand on `run.py` runs the workspace-builder pipeline with discovery components replaced by a deterministic synthesizer:
+
+```
+test config YAML ──┐
+                   ▼
+      [test_synth component]    ← NEW (replaces kubeapi/cloudquery)
+                   │ emits TestResource[]
+                   ▼
+      [generation_rules]        ← existing, unchanged
+                   │ matches via NEW _test-passthrough.yaml rule
+                   ▼
+      [render_output_items]     ← existing, unchanged
+                   │ writes /shared/output/workspaces/<workspaceName>/...
+                   ▼
+      [upload]                  ← existing, unchanged
+                   │ POSTs to PAPI per uploadInfo.yaml
+                   ▼
+      stdout: {"task_id": "...", "workspace_name": "..."}
+```
+
+Components used at minimum: `--components test_synth,generation_rules,render_output_items`. `kubeapi`, `cloudquery`, and `azure_devops` are not invoked, so no cluster, kubeconfig, or cloud auth is required. Whether `load_resources` is needed alongside the simulator components is an implementation-time question (see Open Implementation Questions).
+
+## Invocation Contract
+
+The test suite repo shells out to:
+
+```bash
+python3 run.py simulate \
+    --config test.yaml \
+    --upload-info /shared/uploadInfo.yaml \
+    --base-directory /shared \
+    --upload \
+    --upload-merge-mode keep-existing \
+    --prune-stale-slxs
+```
+
+Standard input: a YAML file at `--config` describing the SLX content for the test scenario.
+Standard output (on success): a JSON envelope captured by the test suite:
+
+```json
+{"task_id": "<task-id>", "workspace_name": "<workspace-name>"}
+```
+
+Standard error: existing logging output from the workspace builder.
+
+The `--upload-merge-mode`, `--prune-stale-slxs`, and `--prune-stale-resources` flags reuse the existing argument definitions from the `run` subcommand and pass through unchanged to the upload logic.
+
+## Test Config Schema (v1)
+
+```yaml
+slxs:
+  my-app-ops:                      # required — dict key becomes SLX baseName/slug
+    levelOfDetail: detailed        # optional, default: basic. (none/basic/detailed)
+    codeCollection: rw-cli-codecollection
+    codeBundle: k8s-deployment-ops # path within the codeCollection
+    runbook:                       # required — passthrough variables to runbook template
+
+    sli:                           # optional — passthrough variables to sli template
+
+    slo:                           # optional — passthrough variables to slo template
+
+
+  another-slx:
+    ...
+```
+
+Design choices:
+
+- The dict key becomes the SLX `baseName`. Renames are explicit in test YAML diffs.
+- `codeCollection` and `codeBundle` are *strings only* — they become labels on rendered SLX YAMLs.
The simulator does not require the named codecollection to exist on disk; the platform stores the labels as references. +- `runbook` / `sli` / `slo` subdicts are passthrough Jinja context. The simulator does not validate their contents; tests that target a specific code bundle pass whatever variables that bundle's template expects. +- An SLI/SLO output item is rendered only if its corresponding key is present in the test config and its value is a non-null, non-empty dict. Missing key, `null`, and `{}` all suppress rendering for that output item. + +Excluded from v1 (added when a specific scenario needs them): `tags`, `customLabels`, `qualifiers`, `group`, `relationships`. + +## File Layout (New Code in runwhen-local) + +``` +src/ +├── run.py # MODIFIED — add SIMULATE_COMMAND, dispatch +├── simulator/ +│ ├── __init__.py +│ ├── simulator.py # entry point; orchestrates synth → render → upload +│ ├── test_synth.py # the test_synth component +│ ├── test_resource.py # TestResource type definition +│ ├── rules/ +│ │ └── _test-passthrough.yaml # passthrough generation rule +│ └── templates/ +│ ├── test-runbook.yaml +│ ├── test-sli.yaml +│ └── test-slo.yaml +``` + +### Responsibilities + +**`run.py`** — Adds `simulate` to the subcommand list. Parses `--config`, `--upload-info`, and the existing upload flags. Reads the test YAML, builds an in-memory `workspaceInfo` dict from `uploadInfo.yaml` fields (`workspaceName`, `papiURL`, `locationId`, `locationName`, `workspaceOwnerEmail`), runs the pipeline with the simulator components, and conditionally executes the existing upload code path. No `workspaceInfo.yaml` file is written to disk — the in-memory dict feeds the existing `coalesce` lookups. + +**`simulator/test_synth.py`** — Implements the same component interface as `kubeapi`/`cloudquery`. Reads the test YAML, instantiates one `TestResource` per SLX entry, and registers them into the resource collection. 
Builds the per-resource `outputItems` list dynamically (only includes `sli` / `slo` entries when those fields are present and resolve to a non-null, non-empty dict).
+
+**`simulator/test_resource.py`** — Defines the `TestResource` type with fields: `slx_slug`, `level_of_detail`, `code_collection`, `code_bundle`, `runbook`, `sli`, `slo`. Registered with the resource type registry so the passthrough rule's `resourceTypes: [test_resource]` matches it.
+
+**`simulator/rules/_test-passthrough.yaml`** — A regular generation rule, but loaded only when the simulator runs (the simulator subcommand injects this directory into the rules-loader search path). Matches all `TestResource` instances; produces one SLX per resource keyed off resource fields.
+
+**`simulator/templates/`** — Minimal Jinja templates that produce schema-valid runbook/sli/slo YAMLs. They include `common-labels.yaml` from `src/templates/` so location info propagates automatically.
+
+## Data Flow Example
+
+Test config:
+
+```yaml
+slxs:
+  my-app-ops:
+    levelOfDetail: detailed
+    codeCollection: rw-cli-codecollection
+    codeBundle: k8s-deployment-ops
+    runbook: { commands: ["echo hello"] }
+    sli: { threshold: 0.99 }
+    slo: { target: 99.5 }
+```
+
+Becomes (after `test_synth`):
+
+```python
+TestResource(
+    slx_slug="my-app-ops",
+    level_of_detail="detailed",
+    code_collection="rw-cli-codecollection",
+    code_bundle="k8s-deployment-ops",
+    runbook={"commands": ["echo hello"]},
+    sli={"threshold": 0.99},
+    slo={"target": 99.5},
+)
+```
+
+Passthrough rule matches → produces SLX → render_output_items writes:
+
+```
+/shared/output/workspaces/<workspaceName>/
+├── workspace.yaml              # via existing src/templates/workspace.yaml
+└── slxs/
+    └── my-app-ops/
+        ├── runbook.yaml
+        ├── sli.yaml
+        └── slo.yaml
+```
+
+Each rendered file references `codeCollection: rw-cli-codecollection` and `codeBundle: k8s-deployment-ops` as label strings; the platform validates these as references but does not require local files to exist.
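The rendered workspace directory above is exactly what the upload step packages. A minimal sketch of the tar+gzip+base64 framing, assuming a hypothetical helper name (the real archive code in `run.py` may frame the payload differently):

```python
import base64
import io
import os
import tarfile

def archive_workspace_dir(workspace_dir: str) -> str:
    """Tar + gzip the rendered workspace directory, then base64-encode the
    bytes so the archive can travel inside a JSON upload payload.
    Illustrative only: run.py's actual upload code path may differ in details."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        # arcname keeps the top-level workspace directory name in the archive
        tar.add(workspace_dir, arcname=os.path.basename(workspace_dir))
    return base64.b64encode(buf.getvalue()).decode("ascii")
```

Decoding and extracting on the receiving side reverses the same three steps.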
+
+The existing upload code path tar+gzip+base64s the `<workspaceName>` directory and POSTs to PAPI per `uploadInfo.yaml`.
+
+## Out of Scope
+
+Lives in the separate test suite repo:
+
+- Workspace creation on the platform (PAPI calls, runner secret generation, `uploadInfo.yaml` generation)
+- Pytest fixtures, test framework, test invocation patterns
+- Task status polling after upload (the simulator returns `task_id`; polling is the consumer's job)
+- Post-upload content verification (querying PAPI for SLX/runbook/sli/slo state and asserting)
+- Workspace teardown / cleanup
+- CI integration and credential management
+- Pinning a runwhen-local commit or image tag
+
+Excluded from v1 inside the simulator (added when a specific scenario needs them):
+
+- `tags`, `customLabels`, `qualifiers`, `group`, `relationships` per SLX
+- Custom variables at workspace level beyond `uploadInfo.yaml` fields
+- Multiple test configs in one invocation
+- A Python library API (only the CLI subcommand)
+- Snapshot tests comparing simulator output to a real workspace builder run
+
+## Risks
+
+1. **Platform validation strictness.** If PAPI validates runbook content beyond schema (e.g., command names must exist in a catalog), stub content fails and tests break for non-platform reasons. Discovered in the first run. Fallback: bundle a tiny synthetic test codecollection.
+2. **Schema drift between simulator templates and platform expectations.** The three simulator templates duplicate the structural shape of real codebundle templates. If the platform changes its CRD requirements, simulator templates need parallel updates. Keep simulator templates minimal so drift surface is small.
+3. **Rules-loader extensibility.** The plan to put the passthrough rule in `src/simulator/rules/` assumes the existing rules loader either supports multiple directories or can have its search path injected.
If it iterates one hardcoded directory, fall back to placing the file in the existing rules dir with a `_`-prefix and adding a filter.
+4. **Jinja include path resolution.** Simulator templates do `{% include "common-labels.yaml" %}`. The Jinja `Environment` must search both `src/templates/` and `src/simulator/templates/`. Trivial to configure but needs to be remembered at implementation time.
+
+## Open Implementation Questions (Not Blockers for the Spec)
+
+- Does `OutputItem.template_name` accept paths like `simulator/test-runbook.yaml`, or does it expect flat filenames?
+- Does the existing rules loader iterate a single hardcoded directory or support a search path?
+- Where exactly is the resource type registry, so `TestResource` registers cleanly?
+- Is the `load_resources` component a no-op when there's nothing to load (i.e., empty `codeCollections` and no kubeconfig), or does it need to be omitted from the components list when running the simulator?
+
+All answerable at implementation time by reading code; none change the architecture.
diff --git a/docs/user-guide/features/simulator.md b/docs/user-guide/features/simulator.md
new file mode 100644
index 000000000..e58cccd25
--- /dev/null
+++ b/docs/user-guide/features/simulator.md
@@ -0,0 +1,289 @@
+# Simulator (E2E Upload Test Helper)
+
+The `simulate` subcommand on `run.py` produces a workspace upload archive
+from a deterministic YAML test config and uploads it to PAPI. It is intended
+for use by an external test suite that verifies platform-side ingestion
+behavior; it is not for end-user use.
+
+## Prerequisites
+
+`run.py simulate` is a CLI client; it POSTs to the workspace builder's
+Django REST service at `http://<host>:<port>/run/`, which is where the
+indexer/enricher/renderer components actually execute.
+
+**The simulate subcommand is self-contained.** When invoked, it probes
+`localhost:8000` for an existing REST service and:
+
+- if one is reachable (e.g.
a dev server you started in another terminal,
+  or the bundled service inside a runwhen-local container), uses it
+  directly;
+- if not, automatically starts an embedded Django subprocess on a free
+  port, waits for it to become reachable, runs the request, and shuts it
+  down on exit.
+
+You can override the auto-detection by explicitly pointing the CLI at a
+remote service with `--rest-service-host <host>:<port>` — when supplied,
+no embedding occurs.
+
+### Option A — Standalone (default; embedded service)
+
+The simplest invocation. Just run `simulate`; the CLI handles the REST
+service for you:
+
+```
+python3 run.py simulate \
+  --config test.yaml \
+  --upload-info uploadInfo.yaml \
+  --base-directory <dir> \
+  --upload
+```
+
+On a cold invocation, expect a one-time message like:
+
+```
+No REST service detected at localhost:8000; starting embedded workspace-builder service on localhost:54321
+```
+
+Embedded boot adds ~3 seconds to the first invocation. Subsequent
+invocations within the same shell don't share the embedded process —
+each `simulate` call boots its own and tears it down at exit.
+
+### Option B — Containerized (production-style for an external test suite)
+
+The existing `runwhen-local` Docker image starts the REST service on
+container boot. Pull or build it, run the container, and exec the
+simulate command inside:
+
+```
+# 1. Run the runwhen-local container (REST service starts automatically)
+docker run -d --name runwhen-local \
+  -v "$PWD/test-fixtures:/shared" \
+  ghcr.io/runwhen-contrib/runwhen-local:<tag>
+
+# 2. Exec the simulate command inside the container
+docker exec -w /workspace-builder runwhen-local \
+  python3 run.py simulate \
+    --config /shared/test.yaml \
+    --upload-info /shared/uploadInfo.yaml \
+    --base-directory /shared \
+    --upload
+```
+
+This is the shape an external test suite repo should use: pin a specific
+runwhen-local image tag, mount your test fixtures into `/shared`, and exec
+simulate inside the container.
The REST service host is `localhost` from
+the container's perspective, so no `--rest-service-host` override is
+needed. See `.test/k8s/upload/Taskfile.yaml` for a concrete reference of
+this pattern in CI.
+
+## Invocation
+
+```
+python3 run.py simulate \
+  --config test.yaml \
+  --upload-info uploadInfo.yaml \
+  --base-directory /shared \
+  --upload \
+  [--upload-merge-mode keep-existing | keep-uploaded] \
+  [--prune-stale-slxs] \
+  [--prune-stale-resources] \
+  [--rest-service-host <host>:<port>]
+```
+
+Paths in `--config` and `--upload-info` are resolved relative to
+`--base-directory` if not absolute.
+
+A complete working example is at [`examples/simulator/test.yaml`](../../../examples/simulator/test.yaml).
+
+## Test config schema
+
+The full schema is below; all top-level sections except `slxs` are optional. A
+minimal config has only `slxs` (one entry per SLX) and inherits sensible
+defaults for everything else.
+
+```yaml
+# OPTIONAL. Per-SLX field defaults. Top-level scalars are inherited when the
+# SLX entry doesn't supply the field. Output-item subdicts (runbook/sli/slo)
+# are deep-merged: per-SLX keys win, missing keys inherit from the default.
+# Defaults alone do NOT trigger sli/slo rendering — the SLX must opt in by
+# having an `sli:` (or `slo:`) key.
+defaults:
+  codeCollection: <collection-name>
+  repoURL: <repo-url>
+  ref: <git-ref>
+  runbook:
+    secretsProvided: [...]
+  sli:
+    secretsProvided: [...]
+    intervalStrategy: intermezzo
+    intervalSeconds: 180
+
+# OPTIONAL. Declares clusters, namespaces, and synthetic K8s resources that
+# SLX entries can bind to. If omitted, each SLX is backed by a synthetic
+# Deployment under simulator-cluster/simulator (current default).
+inventory:
+  clusters:
+    - name: <cluster-name>
+      namespaces:
+        - name: <namespace-name>
+          labels: {}              # optional
+          annotations: {}         # optional
+  resources:
+    - id: <resource-id>           # referenced from slxs[*].resources
+      kind: Deployment            # any K8s kind: Deployment, StatefulSet,
+                                  # Service, Ingress, Pod, CronJob, ...
+      name: <resource-name>
+      cluster: <cluster-name>
+      namespace: <namespace-name>
+      labels:                     # become [k8s] tags on the SLX
+        <label-key>: <label-value>
+      annotations: {}
+
+# OPTIONAL. Workspace-level groupings rendered into workspace.yaml's
+# slxGroups. SLX references use the dict keys from slxs (slugs); the
+# simulator translates them to workspace-prefixed full names automatically.
+slxGroups:
+  - name: <group-name>
+    slxs: [<slx-slug>, <slx-slug>, ...]
+    dependsOn: [<group-name>, ...]  # optional
+
+# OPTIONAL. Workspace-level SLX-to-SLX dependencies rendered into
+# workspace.yaml's slxRelationships.
+slxRelationships:
+  - subject: <slx-slug>
+    verb: dependent-on | depended-on-by
+    object: <slx-slug>
+
+# REQUIRED. One entry per SLX.
+slxs:
+  <slx-slug>:
+    levelOfDetail: basic|detailed|none  # optional, default: basic
+    codeCollection: <string>            # required (label only)
+    codeBundle: <string>                # required (path within codeCollection)
+    repoURL: <url>                      # rendered into spec.codeBundle.repoUrl
+    ref: <git-ref>                      # rendered into spec.codeBundle.ref
+
+    # OPTIONAL list of inventory.resources ids this SLX targets.
+    #   1:1 — one entry, one resource. (default behavior when omitted)
+    #   1:N — multiple SLX entries reference the same resource.
+    #   N:1 — list multiple ids; extras become additionalContext.childResources.
+    resources: [<resource-id>, ...]
+
+    # OPTIONAL SLX manifest fields (with sensible defaults).
+    alias: <string>
+    statement: <string>
+    asMeasuredBy: <string>
+    imageURL: <url>
+    owners: [<email>, ...]
+    configProvided: [{name, value}]
+    tags: [{name, value}]         # overrides auto-derived tags
+    additionalContext: {}         # merged with auto-derived
+
+    # REQUIRED. Renders into runbook.yaml's spec.
+    runbook:
+      pathToRobot: <path>         # optional, default: codebundles/<codeBundle>/runbook.robot
+      configProvided: [{name, value}]
+      secretsProvided: [{name, workspaceKey}]
+
+    # OPTIONAL. Renders into sli.yaml's spec when non-empty.
+    sli:
+      pathToRobot: <path>         # optional, default: codebundles/<codeBundle>/sli.robot
+      description: <string>
+      displayUnitsLong: <string>
+      displayUnitsShort: <string>
+      intervalStrategy: <string>  # default: intermezzo
+      intervalSeconds: <int>      # default: 60
+      configProvided: [...]
+      secretsProvided: [...]
+      alertConfig: {}             # optional
+
+    # OPTIONAL.
Renders into slo.yaml's spec when non-empty.
+    slo:
+      pathToRobot: <path>         # optional, default: codebundles/<codeBundle>/slo.robot
+      target: <float>             # default: 99.0
+      configProvided: [...]
+      secretsProvided: [...]
+```
+
+### Auto-derived SLX fields (when `resources` is set)
+
+When an SLX has `resources: [<id>, ...]`, the simulator pulls the bound inventory
+resource(s) and auto-fills:
+
+- `metadata.annotations.qualifiers` — `{"namespace": "...", "cluster": "..."}`
+- `spec.tags`:
+  - `platform: kubernetes`
+  - `cluster: <cluster-name>`
+  - `namespace: <namespace-name>`
+  - `kind: <kind>` (e.g., StatefulSet, Ingress)
+  - `resource_name: <resource-name>`
+  - `resource_type: <resource-type>`
+  - For each k8s label on the resource: `<label-key> [k8s]: <label-value>`
+- `spec.additionalContext.{hierarchy, qualified_name, resourcePath}`
+- For N:1, `spec.additionalContext.childResources` listing the extra resources
+
+Test config can override any auto-derived field by supplying it explicitly
+on the SLX entry (e.g., `tags: [...]` overrides the entire derived tag list).
+
+## Output
+
+On a successful upload, the CLI prints a single JSON line on stdout:
+
+```json
+{"task_id": "<task-id>", "workspace_name": "<workspace-name>"}
+```
+
+Callers should capture stdout and use `task_id` to poll PAPI for the upload
+task's terminal status.
+
+## How it works (architecture)
+
+1. The CLI subcommand reads the test config + `uploadInfo.yaml` and POSTs to
+   the workspace builder REST service at `/run/`.
+2. A bundled simulator codecollection at `src/simulator-codecollection/` is
+   materialized as a temp git repo at runtime and passed via the
+   `codeCollections` request setting.
+3. The pipeline runs `test_synth → generation_rules → test_groups → render_output_items`:
+   - `test_synth` (INDEXER) parses the test config, materializes the
+     inventory topology (clusters, namespaces, resources), and synthesizes
+     one anchor deployment per SLX entry under the `kubernetes` platform.
+   - `generation_rules` (ENRICHER) matches anchors against the passthrough
+     rule in the bundled codecollection and emits one SLX per anchor.
+ - `test_groups` (ENRICHER) reads `slxGroups` and `slxRelationships` from + the test config and populates the workspace-level group / relationship + state, translating slugs to qualified SLX names. + - `render_output_items` (RENDERER) writes the workspace.yaml, slx.yaml, + runbook.yaml, sli.yaml, slo.yaml files using the simulator templates. +4. The CLI then reuses the existing upload code path to POST the resulting + archive to PAPI. + +## What it does NOT do + +The following responsibilities live in the calling test suite, not in the +simulator: + +- Workspace creation, runner secret generation, `uploadInfo.yaml` generation +- Polling task status, post-upload content verification +- Workspace teardown / cleanup + +## Limitations and known caveats + +- SLX directory names are derived from the rule's `baseName` + qualifiers + with internal name-shortening; they do not match the input slug verbatim. + Tests should not hard-code on-disk SLX directory names — they will look + like `dm-ap-op-test-de45f97a` for an SLX slug `demo-app-ops`. +- The simulator codecollection's SLI and SLO templates use a deliberate + exception (`{% if not match_resource.sli %}{% set _x = (1/0) %}{% endif %}`) + to skip rendering when the corresponding subdict is absent. This produces + a `division by zero` entry in the workspace's `skipped_templates_report.md` + per skipped output item; this is expected and does not indicate failure. +- The simulator pretends its inventory is a Kubernetes inventory: anchor + resources are registered under platform `kubernetes` with type `deployment`, + regardless of the user-facing `kind` from inventory. The `kind` flows into + the rendered SLX's tags / additionalContext but does not change the + underlying synthesized resource type. This lets the simulator reuse the + existing kubernetes platform handler without registering a new one. 
+
+- SSL verification: if the target PAPI presents a self-signed or internal-CA
+  cert that the local Python's CA store doesn't trust, set the env var
+  `REQUESTS_CA_BUNDLE` to any value (e.g., `REQUESTS_CA_BUNDLE=skip`) before
+  running. The workspace builder treats this env var as "skip verification."
diff --git a/examples/simulator/test.yaml b/examples/simulator/test.yaml
new file mode 100644
index 000000000..962239520
--- /dev/null
+++ b/examples/simulator/test.yaml
@@ -0,0 +1,141 @@
+# Example test config for the simulate subcommand. See
+# docs/user-guide/features/simulator.md for the full schema.
+#
+# Demonstrates: defaults inheritance, multi-cluster inventory, multiple K8s
+# kinds (Deployment, StatefulSet, Ingress), 1:N cardinality (demo-app-ops +
+# demo-app-health share the same backing resource), N:1 cardinality
+# (demo-prod-aggregate spans two resources), slxGroups, and slxRelationships.
+#
+# Invocation:
+#   python3 run.py simulate \
+#     --config examples/simulator/test.yaml \
+#     --upload-info <uploadInfo.yaml> \
+#     --base-directory <dir> \
+#     --upload
+#
+# To use, supply a real uploadInfo.yaml from a workspace on the target
+# platform; the simulator reads workspaceName, papiURL, locationId,
+# locationName, and token from it.
+
+# Per-SLX fields default to these unless explicitly overridden on the SLX.
+# Top-level scalars are simple replacement; runbook/sli/slo subdicts deep-
+# merge (per-SLX keys win, missing keys inherit from the default).
+defaults: + codeCollection: rw-cli-codecollection + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + runbook: + secretsProvided: + - {name: kubeconfig, workspaceKey: kubeconfig} + sli: + secretsProvided: + - {name: kubeconfig, workspaceKey: kubeconfig} + intervalStrategy: intermezzo + intervalSeconds: 180 + +inventory: + clusters: + - name: prod-cluster + namespaces: + - name: demo + - name: demo-cache + - name: edge-cluster + namespaces: + - name: ingress + + resources: + - id: deploy-demo-app + kind: Deployment + name: demo-app + cluster: prod-cluster + namespace: demo + labels: + app.kubernetes.io/name: demo-app + tier: backend + - id: sts-demo-cache + kind: StatefulSet + name: demo-cache + cluster: prod-cluster + namespace: demo-cache + labels: + app.kubernetes.io/name: demo-cache + - id: ingress-demo + kind: Ingress + name: demo-public + cluster: edge-cluster + namespace: ingress + +slxGroups: + - name: Demo App + slxs: [demo-app-ops, demo-app-health] + - name: Demo Cache + slxs: [demo-cache-ops] + - name: Edge Ingress + slxs: [demo-ingress-health] + dependsOn: [Demo App] + +slxRelationships: + - subject: demo-ingress-health + verb: dependent-on + object: demo-app-health + +slxs: + demo-app-ops: + codeBundle: k8s-deployment-ops + resources: [deploy-demo-app] + alias: Demo App Operations + statement: Operational tasks for demo-app deployment + owners: [demo@example.com] + runbook: + # pathToRobot auto-derived to codebundles/k8s-deployment-ops/runbook.robot + configProvided: + - {name: NAMESPACE, value: demo} + - {name: DEPLOYMENT_NAME, value: demo-app} + - {name: KUBERNETES_DISTRIBUTION_BINARY, value: kubectl} + + demo-app-health: + codeBundle: k8s-deployment-healthcheck + resources: [deploy-demo-app] # 1:N — same resource backs ops and health + alias: Demo App Health + statement: Health metric for demo-app deployment + levelOfDetail: detailed + runbook: + configProvided: + - {name: NAMESPACE, value: demo} + - {name: 
DEPLOYMENT_NAME, value: demo-app} + sli: + description: Demo app deployment health metric + displayUnitsLong: OK + displayUnitsShort: ok + configProvided: + - {name: NAMESPACE, value: demo} + - {name: DEPLOYMENT_NAME, value: demo-app} + slo: + target: 99.5 + + demo-cache-ops: + codeBundle: k8s-statefulset-ops + resources: [sts-demo-cache] + alias: Demo Cache Operations + statement: Operational tasks for demo cache statefulset + runbook: + configProvided: + - {name: NAMESPACE, value: demo-cache} + - {name: STATEFULSET_NAME, value: demo-cache} + + demo-ingress-health: + codeBundle: k8s-ingress-healthcheck + resources: [ingress-demo] + alias: Demo Ingress Health + statement: Health checks for demo public ingress + runbook: + configProvided: + - {name: NAMESPACE, value: ingress} + - {name: INGRESS_NAME, value: demo-public} + + demo-prod-aggregate: + codeBundle: k8s-aggregate-health + resources: [deploy-demo-app, sts-demo-cache] # N:1 — aggregate over two resources + alias: Production Aggregate Health + statement: Aggregate health across the demo workload + runbook: {} diff --git a/src/component.py b/src/component.py index 5dd857f11..797bceab5 100644 --- a/src/component.py +++ b/src/component.py @@ -248,8 +248,8 @@ def init_components(): # be added here, which is less than ideal, although practically may not be # a huge deal. 
component_stages_init = ( - (Stage.INDEXER, ["load_resources", "kubeapi", "cloudquery", "azure_devops"]), - (Stage.ENRICHER, ["generation_rules"]), + (Stage.INDEXER, ["load_resources", "kubeapi", "cloudquery", "azure_devops", "test_synth"]), + (Stage.ENRICHER, ["generation_rules", "test_groups"]), (Stage.RENDERER, ["render_output_items", "dump_resources"]) ) for stage, component_names in component_stages_init: diff --git a/src/enrichers/test_groups.py b/src/enrichers/test_groups.py new file mode 100644 index 000000000..cc0f9e7dd --- /dev/null +++ b/src/enrichers/test_groups.py @@ -0,0 +1,124 @@ +"""Simulator companion enricher that populates slxGroups and slxRelationships +from the testConfig YAML. + +Runs after generation_rules so the SLXS_PROPERTY context property is fully +populated; we use it to translate user-facing SLX slugs into the qualified +{workspace}--{slx-dir} full names that workspace.yaml's slxGroups list and +slxRelationships list reference. +""" +import yaml + +from component import Context, SettingDependency +from enrichers.generation_rules import ( + SLXS_PROPERTY, + GROUPS_PROPERTY, + SLX_RELATIONSHIPS_PROPERTY, + Group, + SLXRelationship, +) +from enrichers.map_customization_rules import RelationshipVerb +from indexers.test_synth import TEST_CONFIG_SETTING + +DOCUMENTATION = ( + "Populate slxGroups and slxRelationships from the simulator's testConfig " + "(only active when test_synth has populated SLXs from the same config)." +) + +SETTINGS = ( + SettingDependency(TEST_CONFIG_SETTING, False), +) + + +def _slug_to_full_name_map(slxs_by_full_name: dict) -> dict: + """Build slug -> workspace-prefixed full SLX name map from the SLXs that + the gen rule produced. + + The passthrough rule uses qualifiers: [resource], so each SLXInfo's + qualifier_values[0] is the slug (= the resource name = the test config + dict key). 
The value we want for slxGroups / slxRelationships is the + workspace-prefixed metadata.name, which is exposed as slx_info.name (NOT + slx_info.full_name — that's the unprefixed qualified name). + """ + slug_to_full = {} + for slx_info in slxs_by_full_name.values(): + qualifier_values = getattr(slx_info, "qualifier_values", None) + prefixed_name = getattr(slx_info, "name", None) + if qualifier_values and prefixed_name: + slug = qualifier_values[0] + slug_to_full[slug] = prefixed_name + return slug_to_full + + +def _resolve_verb(verb_str: str) -> str: + """Resolve a user-supplied verb string into the canonical enum value. + + Returns the verb's string representation (not the enum object) because + workspace.yaml's template stringifies the attribute via {{ verb }} and + Python's default Enum.__str__ produces "RelationshipVerb.DEPENDENT_ON" + instead of "dependent-on". Storing the value directly sidesteps the + template's lack of a .value lookup. + """ + if not verb_str: + return RelationshipVerb.DEPENDENT_ON.value + try: + return RelationshipVerb(verb_str).value + except ValueError: + return verb_str + + +def enrich(context: Context): + config_text = context.get_setting(TEST_CONFIG_SETTING) + if not config_text: + return + + config = yaml.safe_load(config_text) or {} + groups_config = config.get("slxGroups") or [] + relationships_config = config.get("slxRelationships") or [] + + if not groups_config and not relationships_config: + return + + slxs_by_full_name = context.get_property(SLXS_PROPERTY) or {} + slug_to_full = _slug_to_full_name_map(slxs_by_full_name) + + if groups_config: + groups = context.get_property(GROUPS_PROPERTY) + if groups is None: + groups = {} + context.set_property(GROUPS_PROPERTY, groups) + for group_cfg in groups_config: + name = group_cfg.get("name") + if not name: + continue + slx_slugs = group_cfg.get("slxs") or [] + slx_full_names = [ + slug_to_full[s] for s in slx_slugs if s in slug_to_full + ] + dependencies = group_cfg.get("dependsOn") 
or [] + existing = groups.get(name) + if existing is None: + groups[name] = Group(name, slx_full_names, dependencies) + else: + for slx in slx_full_names: + if slx not in existing.slxs: + existing.add_slx(slx) + for dep in dependencies: + if dep not in existing.dependencies: + existing.add_dependency(dep) + + if relationships_config: + slx_relationships = context.get_property(SLX_RELATIONSHIPS_PROPERTY) + if slx_relationships is None: + slx_relationships = [] + context.set_property(SLX_RELATIONSHIPS_PROPERTY, slx_relationships) + for rel_cfg in relationships_config: + subject_slug = rel_cfg.get("subject") + object_slug = rel_cfg.get("object") + verb = _resolve_verb(rel_cfg.get("verb")) + if not (subject_slug and object_slug): + continue + subject_full = slug_to_full.get(subject_slug, subject_slug) + object_full = slug_to_full.get(object_slug, object_slug) + slx_relationships.append( + SLXRelationship(subject_full, verb, object_full) + ) diff --git a/src/indexers/test_synth.py b/src/indexers/test_synth.py new file mode 100644 index 000000000..04acf63c3 --- /dev/null +++ b/src/indexers/test_synth.py @@ -0,0 +1,371 @@ +import yaml + +from component import Context, Setting, SettingDependency +from indexers.kubetypes import KUBERNETES_PLATFORM, KubernetesResourceType +from resources import Registry, REGISTRY_PROPERTY_NAME +from enrichers.generation_rule_types import LevelOfDetail + +DOCUMENTATION = "Synthesize deterministic test resources for E2E upload tests" + +# Resources are registered under the existing kubernetes platform so that the +# already-registered KubernetesPlatformHandler picks them up. The simulator's +# inventory effectively pretends to be a kubernetes inventory. +TEST_PLATFORM_NAME = KUBERNETES_PLATFORM + +# Anchor resources (the ones the passthrough rule matches) are always typed as +# deployments. 
The user-facing "kind" specified in inventory.resources is what +# flows into the rendered SLX's tags/additionalContext, but the gen rule only +# needs a single resource type to match against. +ANCHOR_RESOURCE_TYPE_NAME = KubernetesResourceType.DEPLOYMENT.value + +# Default cluster/namespace identity used when the test config omits an +# inventory section entirely. Preserves backwards compatibility with simpler +# test configs. +DEFAULT_CLUSTER_NAME = "simulator-cluster" +DEFAULT_NAMESPACE_NAME = "simulator" + +TEST_CONFIG_SETTING = Setting( + "TEST_CONFIG", + "testConfig", + Setting.Type.STRING, + "YAML string describing SLXs to synthesize for the simulator pipeline", +) + +SETTINGS = ( + SettingDependency(TEST_CONFIG_SETTING, False), +) + + +def _ensure_cluster(registry: Registry, name: str): + cluster = registry.lookup_resource( + KUBERNETES_PLATFORM, KubernetesResourceType.CLUSTER.value, name + ) + if cluster is None: + cluster = registry.add_resource( + platform_name=KUBERNETES_PLATFORM, + resource_type_name=KubernetesResourceType.CLUSTER.value, + resource_name=name, + resource_qualified_name=name, + resource_attributes={ + "context": name, + "namespaces": {}, + }, + ) + return cluster + + +def _ensure_namespace(registry: Registry, cluster, ns_name: str, + labels: dict, annotations: dict): + qualified_name = f"{cluster.name}/{ns_name}" + namespace = registry.lookup_resource( + KUBERNETES_PLATFORM, + KubernetesResourceType.NAMESPACE.value, + qualified_name, + ) + if namespace is None: + namespace = registry.add_resource( + platform_name=KUBERNETES_PLATFORM, + resource_type_name=KubernetesResourceType.NAMESPACE.value, + resource_name=ns_name, + resource_qualified_name=qualified_name, + resource_attributes={ + "cluster": cluster, + "lod": LevelOfDetail.DETAILED, + "labels": labels or {}, + "annotations": annotations or {}, + }, + ) + return namespace + + +def _build_inventory_index(registry: Registry, inventory_config: dict) -> dict: + """Materialize the 
cluster/namespace topology and build a lookup map from + resource id (as referenced by SLX entries) to its info dict.""" + inventory_index: dict = {} + cluster_lookup: dict = {} + ns_lookup: dict = {} # (cluster_name, ns_name) -> namespace resource + + for cluster_cfg in inventory_config.get("clusters") or []: + cluster_name = cluster_cfg["name"] + cluster = _ensure_cluster(registry, cluster_name) + cluster_lookup[cluster_name] = cluster + for ns_cfg in cluster_cfg.get("namespaces") or []: + ns_name = ns_cfg["name"] if isinstance(ns_cfg, dict) else ns_cfg + ns_labels = ns_cfg.get("labels") if isinstance(ns_cfg, dict) else {} + ns_annotations = ns_cfg.get("annotations") if isinstance(ns_cfg, dict) else {} + namespace = _ensure_namespace( + registry, cluster, ns_name, ns_labels, ns_annotations + ) + ns_lookup[(cluster_name, ns_name)] = namespace + + for resource_cfg in inventory_config.get("resources") or []: + resource_id = resource_cfg["id"] + cluster_name = resource_cfg["cluster"] + ns_name = resource_cfg["namespace"] + kind = resource_cfg.get("kind", "Deployment") + resource_name = resource_cfg.get("name", resource_id) + labels = resource_cfg.get("labels") or {} + annotations = resource_cfg.get("annotations") or {} + # Cluster + namespace must already exist (declared above) for a clean + # inventory; if not, auto-create as a defensive default. 
+ cluster = cluster_lookup.get(cluster_name) or _ensure_cluster(registry, cluster_name) + cluster_lookup.setdefault(cluster_name, cluster) + namespace = ns_lookup.get((cluster_name, ns_name)) + if namespace is None: + namespace = _ensure_namespace(registry, cluster, ns_name, {}, {}) + ns_lookup[(cluster_name, ns_name)] = namespace + inventory_index[resource_id] = { + "id": resource_id, + "kind": kind, + "name": resource_name, + "cluster": cluster_name, + "namespace": ns_name, + "namespace_resource": namespace, + "labels": labels, + "annotations": annotations, + "qualified_name": f"{cluster_name}/{ns_name}/{resource_name}", + "resource_path": f"kubernetes/{cluster_name}/{ns_name}/{resource_name}", + } + return inventory_index + + +def _auto_derived_attrs_from_inventory(primary: dict, children: list[dict]) -> dict: + """Compute SLX tags + additionalContext + qualifiers from the bound + inventory resource(s). The first resource is the "primary"; any others are + treated as child resources in additionalContext.""" + cluster = primary["cluster"] + namespace = primary["namespace"] + kind = primary["kind"] + resource_name = primary["name"] + qualified_name = primary["qualified_name"] + resource_path = primary["resource_path"] + + tags = [ + {"name": "platform", "value": "kubernetes"}, + {"name": "cluster", "value": cluster}, + {"name": "namespace", "value": namespace}, + {"name": "kind", "value": kind}, + {"name": "resource_name", "value": resource_name}, + {"name": "resource_type", "value": kind.lower()}, + ] + for label_key, label_value in (primary.get("labels") or {}).items(): + tags.append({"name": f"[k8s]{label_key}", "value": str(label_value)}) + + additional_context = { + "hierarchy": ["platform", "cluster", "resource_name"], + "qualified_name": qualified_name, + "resourcePath": resource_path, + } + if children: + additional_context["childResources"] = [ + { + "kind": c["kind"], + "name": c["name"], + "qualified_name": c["qualified_name"], + "resourcePath": 
c["resource_path"], + } + for c in children + ] + + qualifiers = { + "namespace": namespace, + "cluster": cluster, + } + + return { + "tags": tags, + "additional_context": additional_context, + "qualifiers": qualifiers, + } + + +_OUTPUT_ITEM_KEYS = ("runbook", "sli", "slo") + + +def _merge_with_defaults(entry: dict, defaults: dict) -> dict: + """Merge a per-SLX entry on top of the test config's `defaults` block. + + Top-level scalar/list fields: per-SLX value wins; if absent, default is + used. Output-item subdicts (runbook/sli/slo): if the SLX has the key, + deep-merge with the matching default subdict (per-SLX keys win); if the + SLX doesn't have the key, the output item is NOT rendered (defaults + alone don't trigger rendering — the SLX must opt in). + """ + if not defaults: + return dict(entry) + + merged = {} + # Top-level fields from defaults first + for key, value in defaults.items(): + if key in _OUTPUT_ITEM_KEYS: + continue + merged[key] = value + # Top-level fields from entry override + for key, value in entry.items(): + if key in _OUTPUT_ITEM_KEYS: + continue + merged[key] = value + # Output items: merge only when the SLX has opted in by setting the key + for output_key in _OUTPUT_ITEM_KEYS: + if output_key not in entry: + continue + default_sub = defaults.get(output_key) or {} + entry_sub = entry.get(output_key) or {} + # Per-SLX keys win over defaults at this nested level too. 
+        merged_sub = dict(default_sub)
+        merged_sub.update(entry_sub)
+        merged[output_key] = merged_sub
+    return merged
+
+
+def _autoderive_path_to_robot(subdict: dict, code_bundle: str, item_kind: str) -> dict:
+    """If a runbook/sli/slo subdict is non-empty but lacks pathToRobot, fill
+    it in with the conventional codebundles/<code_bundle>/<item_kind>.robot path."""
+    if not subdict:
+        return subdict
+    if "pathToRobot" in subdict and subdict["pathToRobot"]:
+        return subdict
+    result = dict(subdict)
+    result["pathToRobot"] = f"codebundles/{code_bundle}/{item_kind}.robot"
+    return result
+
+
+def _resource_attrs_from_slx_entry(
+    slug: str, entry: dict, inventory_index: dict
+) -> tuple[dict, str, str]:
+    """Project a test config SLX entry into the attribute dict for an anchor
+    resource. Returns (attrs, cluster_name, namespace_name) so the caller can
+    attach the anchor to the right namespace.
+
+    Behavior:
+    - If the entry has `resources: [<id>, ...]`, look up those inventory
+      resources and auto-derive tags/additionalContext/qualifiers.
+    - Else, fall back to the simulator-default cluster/namespace.
+    - User-supplied tags/additionalContext on the entry override auto-derived
+      ones (merged at top level).
+ """ + code_bundle = entry["codeBundle"] + runbook = _autoderive_path_to_robot(entry.get("runbook") or {}, code_bundle, "runbook") + sli = entry.get("sli") or None + if sli: + sli = _autoderive_path_to_robot(sli, code_bundle, "sli") + slo = entry.get("slo") or None + if slo: + slo = _autoderive_path_to_robot(slo, code_bundle, "slo") + + base_attrs = { + "slx_slug": slug, + "level_of_detail": entry.get("levelOfDetail", "basic"), + "code_collection": entry["codeCollection"], + "code_bundle": code_bundle, + "repo_url": entry.get("repoURL", ""), + "ref": entry.get("ref", "main"), + "alias": entry.get("alias", slug), + "statement": entry.get("statement", ""), + "as_measured_by": entry.get("asMeasuredBy", ""), + "image_url": entry.get("imageURL", ""), + "owners": entry.get("owners", []), + "config_provided": entry.get("configProvided", []), + "runbook": runbook, + "sli": sli, + "slo": slo, + } + + # Auto-derive from bound inventory resources, or fall back to defaults. + resource_ids = entry.get("resources") or [] + bound_resources = [inventory_index[rid] for rid in resource_ids if rid in inventory_index] + + if bound_resources: + primary, *children = bound_resources + derived = _auto_derived_attrs_from_inventory(primary, children) + cluster_name = primary["cluster"] + namespace_name = primary["namespace"] + else: + derived = { + "tags": [ + {"name": "platform", "value": "kubernetes"}, + {"name": "cluster", "value": DEFAULT_CLUSTER_NAME}, + {"name": "namespace", "value": DEFAULT_NAMESPACE_NAME}, + {"name": "kind", "value": "Deployment"}, + {"name": "resource_name", "value": slug}, + {"name": "resource_type", "value": "deployment"}, + ], + "additional_context": { + "hierarchy": ["platform", "cluster", "resource_name"], + "qualified_name": f"{DEFAULT_CLUSTER_NAME}/{DEFAULT_NAMESPACE_NAME}/{slug}", + "resourcePath": f"kubernetes/{DEFAULT_CLUSTER_NAME}/{DEFAULT_NAMESPACE_NAME}/{slug}", + }, + "qualifiers": { + "namespace": DEFAULT_NAMESPACE_NAME, + "cluster": 
DEFAULT_CLUSTER_NAME, + }, + } + cluster_name = DEFAULT_CLUSTER_NAME + namespace_name = DEFAULT_NAMESPACE_NAME + + # User overrides (if any) win over auto-derived values. + user_tags = entry.get("tags") + user_additional_context = entry.get("additionalContext") + base_attrs["tags"] = user_tags if user_tags is not None else derived["tags"] + if user_additional_context is not None: + merged_ctx = dict(derived["additional_context"]) + merged_ctx.update(user_additional_context) + base_attrs["additional_context"] = merged_ctx + else: + base_attrs["additional_context"] = derived["additional_context"] + base_attrs["sim_qualifiers"] = derived["qualifiers"] + + return base_attrs, cluster_name, namespace_name + + +def index(context: Context): + config_text = context.get_setting(TEST_CONFIG_SETTING) + if not config_text: + return + + config = yaml.safe_load(config_text) or {} + inventory_config = config.get("inventory") or {} + defaults_config = config.get("defaults") or {} + slxs = config.get("slxs") or {} + + registry: Registry = context.get_property(REGISTRY_PROPERTY_NAME) + if registry is None: + registry = Registry() + context.set_property(REGISTRY_PROPERTY_NAME, registry) + + inventory_index = _build_inventory_index(registry, inventory_config) + + # Ensure the default cluster + namespace exist for SLXs that didn't bind to + # an inventory resource. + default_cluster = _ensure_cluster(registry, DEFAULT_CLUSTER_NAME) + _ensure_namespace(registry, default_cluster, DEFAULT_NAMESPACE_NAME, {}, {}) + + for slug, entry in slxs.items(): + merged_entry = _merge_with_defaults(entry, defaults_config) + attrs, cluster_name, namespace_name = _resource_attrs_from_slx_entry( + slug, merged_entry, inventory_index + ) + # Anchor resource attached to its (cluster, namespace). 
+ anchor_namespace = registry.lookup_resource( + KUBERNETES_PLATFORM, + KubernetesResourceType.NAMESPACE.value, + f"{cluster_name}/{namespace_name}", + ) + if anchor_namespace is None: + cluster = _ensure_cluster(registry, cluster_name) + anchor_namespace = _ensure_namespace( + registry, cluster, namespace_name, {}, {} + ) + + attrs["namespace"] = anchor_namespace + attrs["kind"] = "Deployment" + attrs["labels"] = {} + attrs["annotations"] = {} + qualified_name = f"{cluster_name}/{namespace_name}/{slug}" + registry.add_resource( + platform_name=TEST_PLATFORM_NAME, + resource_type_name=ANCHOR_RESOURCE_TYPE_NAME, + resource_name=slug, + resource_qualified_name=qualified_name, + resource_attributes=attrs, + ) diff --git a/src/poetry.lock b/src/poetry.lock new file mode 100644 index 000000000..ff0a0322b --- /dev/null +++ b/src/poetry.lock @@ -0,0 +1,1396 @@ +# This file is automatically @generated by Poetry 2.4.0 and should not be changed by hand. + +[[package]] +name = "asgiref" +version = "3.11.1" +description = "ASGI specs, helper code, and adapters" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "asgiref-3.11.1-py3-none-any.whl", hash = "sha256:e8667a091e69529631969fd45dc268fa79b99c92c5fcdda727757e52146ec133"}, + {file = "asgiref-3.11.1.tar.gz", hash = "sha256:5f184dc43b7e763efe848065441eac62229c9f7b0475f41f80e207a114eda4ce"}, +] + +[package.extras] +tests = ["mypy (>=1.14.0)", "pytest", "pytest-asyncio"] + +[[package]] +name = "attrs" +version = "26.1.0" +description = "Classes Without Boilerplate" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "attrs-26.1.0-py3-none-any.whl", hash = "sha256:c647aa4a12dfbad9333ca4e71fe62ddc36f4e63b2d260a37a8b83d2f043ac309"}, + {file = "attrs-26.1.0.tar.gz", hash = "sha256:d03ceb89cb322a8fd706d4fb91940737b6642aa36998fe130a9bc96c985eff32"}, +] + +[[package]] +name = "azure-common" +version = "1.1.28" +description = "Microsoft Azure Client Library for Python 
(Common)" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "azure-common-1.1.28.zip", hash = "sha256:4ac0cd3214e36b6a1b6a442686722a5d8cc449603aa833f3f0f40bda836704a3"}, + {file = "azure_common-1.1.28-py2.py3-none-any.whl", hash = "sha256:5c12d3dcf4ec20599ca6b0d3e09e86e146353d443e7fcc050c9a19c1f9df20ad"}, +] + +[[package]] +name = "azure-core" +version = "1.40.0" +description = "Microsoft Azure Core Library for Python" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "azure_core-1.40.0-py3-none-any.whl", hash = "sha256:7f3ea02579b1bb1d34e45043423b650621d11d7c2ea3b05e5554010080b78dfd"}, + {file = "azure_core-1.40.0.tar.gz", hash = "sha256:ecf5b6ddf2564471fae9d576147b7e77a4da285958b2d9f4fd6c3af104f3e9d7"}, +] + +[package.dependencies] +requests = ">=2.21.0" +typing-extensions = ">=4.6.0" + +[package.extras] +aio = ["aiohttp (>=3.0)"] +tracing = ["opentelemetry-api (>=1.26,<2.0)"] + +[[package]] +name = "azure-devops" +version = "7.1.0b4" +description = "Python wrapper around the Azure DevOps 7.x APIs" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "azure-devops-7.1.0b4.tar.gz", hash = "sha256:f04ba939112579f3d530cfecc044a74ef9e9339ba23c9ee1ece248241f07ff85"}, + {file = "azure_devops-7.1.0b4-py3-none-any.whl", hash = "sha256:f827e9fbc7c77bc6f2aaee46e5717514e9fe7d676c87624eccd0ca640b54f122"}, +] + +[package.dependencies] +msrest = ">=0.7.1,<0.8.0" + +[[package]] +name = "azure-identity" +version = "1.25.3" +description = "Microsoft Azure Identity Library for Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "azure_identity-1.25.3-py3-none-any.whl", hash = "sha256:f4d0b956a8146f30333e071374171f3cfa7bdb8073adb8c3814b65567aa7447c"}, + {file = "azure_identity-1.25.3.tar.gz", hash = "sha256:ab23c0d63015f50b630ef6c6cf395e7262f439ce06e5d07a64e874c724f8d9e6"}, +] + +[package.dependencies] +azure-core = ">=1.31.0" +cryptography = 
">=2.5" +msal = ">=1.35.1" +msal-extensions = ">=1.2.0" +typing-extensions = ">=4.0.0" + +[[package]] +name = "azure-mgmt-containerservice" +version = "40.2.0" +description = "Microsoft Azure Containerservice Management Client Library for Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "azure_mgmt_containerservice-40.2.0-py3-none-any.whl", hash = "sha256:a73d9720dd0ae29257dd6dcf39942b9274f15b588ffe25f4564aa77798eb4703"}, + {file = "azure_mgmt_containerservice-40.2.0.tar.gz", hash = "sha256:acd55cae95b768efeb0377d83dea07d610c434eec0c089e02935ff31f0e3e07d"}, +] + +[package.dependencies] +azure-mgmt-core = ">=1.6.0" +msrest = ">=0.7.1" +typing-extensions = ">=4.6.0" + +[[package]] +name = "azure-mgmt-core" +version = "1.6.0" +description = "Microsoft Azure Management Core Library for Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "azure_mgmt_core-1.6.0-py3-none-any.whl", hash = "sha256:0460d11e85c408b71c727ee1981f74432bc641bb25dfcf1bb4e90a49e776dbc4"}, + {file = "azure_mgmt_core-1.6.0.tar.gz", hash = "sha256:b26232af857b021e61d813d9f4ae530465255cb10b3dde945ad3743f7a58e79c"}, +] + +[package.dependencies] +azure-core = ">=1.32.0" + +[[package]] +name = "azure-mgmt-resource" +version = "24.0.0" +description = "Microsoft Azure Resource Management Client Library for Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "azure_mgmt_resource-24.0.0-py3-none-any.whl", hash = "sha256:27b32cd223e2784269f5a0db3c282042886ee4072d79cedc638438ece7cd0df4"}, + {file = "azure_mgmt_resource-24.0.0.tar.gz", hash = "sha256:cf6b8995fcdd407ac9ff1dd474087129429a1d90dbb1ac77f97c19b96237b265"}, +] + +[package.dependencies] +azure-common = ">=1.1" +azure-mgmt-core = ">=1.5.0" +isodate = ">=0.6.1" +typing-extensions = ">=4.6.0" + +[[package]] +name = "boto3" +version = "1.43.3" +description = "The AWS SDK for Python" +optional = false +python-versions = ">=3.10" 
+groups = ["main"] +files = [ + {file = "boto3-1.43.3-py3-none-any.whl", hash = "sha256:fb9fe51849ef2a78198d582756fc06f14f7de27f73e0fa90275d6aa4171eb4d0"}, + {file = "boto3-1.43.3.tar.gz", hash = "sha256:7c7777862ffc898f05efa566032bbabfe226dbb810e35ec11125817f128bc5c5"}, +] + +[package.dependencies] +botocore = ">=1.43.3,<1.44.0" +jmespath = ">=0.7.1,<2.0.0" +s3transfer = ">=0.17.0,<0.18.0" + +[package.extras] +crt = ["botocore[crt] (>=1.21.0,<2.0a0)"] + +[[package]] +name = "botocore" +version = "1.43.3" +description = "Low-level, data-driven core of boto 3." +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "botocore-1.43.3-py3-none-any.whl", hash = "sha256:ec0769eb0f7c5034856bb406a92698dbc02a3d4be0f78a384747106b161d8ea3"}, + {file = "botocore-1.43.3.tar.gz", hash = "sha256:eac6da0fffccf87888ebf4d89f0b2378218a707efa748cd955b838995e944695"}, +] + +[package.dependencies] +jmespath = ">=0.7.1,<2.0.0" +python-dateutil = ">=2.1,<3.0.0" +urllib3 = ">=1.25.4,<2.2.0 || >2.2.0,<3" + +[package.extras] +crt = ["awscrt (==0.32.2)"] + +[[package]] +name = "certifi" +version = "2026.4.22" +description = "Python package for providing Mozilla's CA Bundle." +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "certifi-2026.4.22-py3-none-any.whl", hash = "sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a"}, + {file = "certifi-2026.4.22.tar.gz", hash = "sha256:8d455352a37b71bf76a79caa83a3d6c25afee4a385d632127b6afb3963f1c580"}, +] + +[[package]] +name = "cffi" +version = "2.0.0" +description = "Foreign Function Interface for Python calling C code." 
+optional = false +python-versions = ">=3.9" +groups = ["main"] +markers = "platform_python_implementation != \"PyPy\"" +files = [ + {file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"}, + {file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"}, + {file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"}, + {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"}, + {file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"}, + {file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"}, + {file = 
"cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"}, + {file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"}, + {file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"}, + {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"}, + {file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"}, + {file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"}, + {file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"}, + {file = 
"cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"}, + {file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"}, + {file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"}, + {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"}, + {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"}, + {file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"}, + {file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"}, + {file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"}, + {file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"}, + {file = 
"cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"}, + {file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"}, + {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"}, + {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"}, + {file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"}, + {file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"}, + {file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"}, + {file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"}, + {file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"}, + {file = 
"cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"}, + {file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"}, + {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"}, + {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"}, + {file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"}, + {file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"}, + {file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"}, + {file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"}, + {file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"}, + {file = 
"cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"}, + {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"}, + {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"}, + {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"}, + {file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"}, + {file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"}, + {file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"}, + {file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"}, + {file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"}, + {file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"}, + {file = 
"cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"}, + {file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"}, + {file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"}, + {file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"}, + {file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"}, +] + +[package.dependencies] +pycparser = {version = "*", markers = "implementation_name != \"PyPy\""} + +[[package]] +name = "charset-normalizer" +version = "3.4.7" +description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
+optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "charset_normalizer-3.4.7-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cdd68a1fb318e290a2077696b7eb7a21a49163c455979c639bf5a5dcdc46617d"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e17b8d5d6a8c47c85e68ca8379def1303fd360c3e22093a807cd34a71cd082b8"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:511ef87c8aec0783e08ac18565a16d435372bc1ac25a91e6ac7f5ef2b0bff790"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:007d05ec7321d12a40227aae9e2bc6dca73f3cb21058999a1df9e193555a9dcc"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cf29836da5119f3c8a8a70667b0ef5fdca3bb12f80fd06487cfa575b3909b393"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux_2_31_armv7l.whl", hash = "sha256:12d8baf840cc7889b37c7c770f478adea7adce3dcb3944d02ec87508e2dcf153"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:d560742f3c0d62afaccf9f41fe485ed69bd7661a241f86a3ef0f0fb8b1a397af"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:b14b2d9dac08e28bb8046a1a0434b1750eb221c8f5b87a68f4fa11a6f97b5e34"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:bc17a677b21b3502a21f66a8cc64f5bfad4df8a0b8434d661666f8ce90ac3af1"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:750e02e074872a3fad7f233b47734166440af3cdea0add3e95163110816d6752"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_riscv64.whl", hash = 
"sha256:4e5163c14bffd570ef2affbfdd77bba66383890797df43dc8b4cc7d6f500bf53"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6ed74185b2db44f41ef35fd1617c5888e59792da9bbc9190d6c7300617182616"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:94e1885b270625a9a828c9793b4d52a64445299baa1fea5a173bf1d3dd9a1a5a"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-win32.whl", hash = "sha256:6785f414ae0f3c733c437e0f3929197934f526d19dfaa75e18fdb4f94c6fb374"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-win_amd64.whl", hash = "sha256:6696b7688f54f5af4462118f0bfa7c1621eeb87154f77fa04b9295ce7a8f2943"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-win_arm64.whl", hash = "sha256:66671f93accb62ed07da56613636f3641f1a12c13046ce91ffc923721f23c008"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7641bb8895e77f921102f72833904dcd9901df5d6d72a2ab8f31d04b7e51e4e7"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:202389074300232baeb53ae2569a60901f7efadd4245cf3a3bf0617d60b439d7"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:30b8d1d8c52a48c2c5690e152c169b673487a2a58de1ec7393196753063fcd5e"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:532bc9bf33a68613fd7d65e4b1c71a6a38d7d42604ecf239c77392e9b4e8998c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2fe249cb4651fd12605b7288b24751d8bfd46d35f12a20b1ba33dea122e690df"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_armv7l.whl", hash = "sha256:65bcd23054beab4d166035cabbc868a09c1a49d1efe458fe8e4361215df40265"}, + {file = 
"charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:08e721811161356f97b4059a9ba7bafb23ea5ee2255402c42881c214e173c6b4"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e060d01aec0a910bdccb8be71faf34e7799ce36950f8294c8bf612cba65a2c9e"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:38c0109396c4cfc574d502df99742a45c72c08eff0a36158b6f04000043dbf38"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:1c2a768fdd44ee4a9339a9b0b130049139b8ce3c01d2ce09f67f5a68048d477c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:1a87ca9d5df6fe460483d9a5bbf2b18f620cbed41b432e2bddb686228282d10b"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:d635aab80466bc95771bb78d5370e74d36d1fe31467b6b29b8b57b2a3cd7d22c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ae196f021b5e7c78e918242d217db021ed2a6ace2bc6ae94c0fc596221c7f58d"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win32.whl", hash = "sha256:adb2597b428735679446b46c8badf467b4ca5f5056aae4d51a19f9570301b1ad"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win_amd64.whl", hash = "sha256:8e385e4267ab76874ae30db04c627faaaf0b509e1ccc11a95b3fc3e83f855c00"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win_arm64.whl", hash = "sha256:d4a48e5b3c2a489fae013b7589308a40146ee081f6f509e047e0e096084ceca1"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2"}, + {file = 
"charset_normalizer-3.4.7-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win32.whl", hash = 
"sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win_amd64.whl", hash = "sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win_arm64.whl", hash = "sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = 
"sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win32.whl", hash = "sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win_amd64.whl", hash = "sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win_arm64.whl", hash = "sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41"}, + {file = 
"charset_normalizer-3.4.7-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win32.whl", hash = "sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win_amd64.whl", hash = "sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win_arm64.whl", hash = "sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-macosx_10_15_universal2.whl", 
hash = "sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_s390x.whl", hash = 
"sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-win32.whl", hash = "sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-win_amd64.whl", hash = "sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-win_arm64.whl", hash = "sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e5f4d355f0a2b1a31bc3edec6795b46324349c9cb25eed068049e4f472fb4259"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:16d971e29578a5e97d7117866d15889a4a07befe0e87e703ed63cd90cb348c01"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:dca4bbc466a95ba9c0234ef56d7dd9509f63da22274589ebd4ed7f1f4d4c54e3"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e80c8378d8f3d83cd3164da1ad2df9e37a666cdde7b1cb2298ed0b558064be30"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:36836d6ff945a00b88ba1e4572d721e60b5b8c98c155d465f56ad19d68f23734"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux_2_31_armv7l.whl", hash = "sha256:bd9b23791fe793e4968dba0c447e12f78e425c59fc0e3b97f6450f4781f3ee60"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:aef65cd602a6d0e0ff6f9930fcb1c8fec60dd2cfcb6facaf4bdb0e5873042db0"}, + {file = 
"charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:82b271f5137d07749f7bf32f70b17ab6eaabedd297e75dce75081a24f76eb545"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:1efde3cae86c8c273f1eb3b287be7d8499420cf2fe7585c41d370d3e790054a5"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:c593052c465475e64bbfe5dbd81680f64a67fdc752c56d7a0ae205dc8aeefe0f"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_riscv64.whl", hash = "sha256:af21eb4409a119e365397b2adbaca4c9ccab56543a65d5dbd9f920d6ac29f686"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:84c018e49c3bf790f9c2771c45e9313a08c2c2a6342b162cd650258b57817706"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:dd915403e231e6b1809fe9b6d9fc55cf8fb5e02765ac625d9cd623342a7905d7"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-win32.whl", hash = "sha256:320ade88cfb846b8cd6b4ddf5ee9e80ee0c1f52401f2456b84ae1ae6a1a5f207"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-win_amd64.whl", hash = "sha256:1dc8b0ea451d6e69735094606991f32867807881400f808a106ee1d963c46a83"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:177a0ba5f0211d488e295aaf82707237e331c24788d8d76c96c5a41594723217"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e0d51f618228538a3e8f46bd246f87a6cd030565e015803691603f55e12afb5"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:14265bfe1f09498b9d8ec91e9ec9fa52775edf90fcbde092b25f4a33d444fea9"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:87fad7d9ba98c86bcb41b2dc8dbb326619be2562af1f8ff50776a39e55721c5a"}, + {file = 
"charset_normalizer-3.4.7-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f22dec1690b584cea26fade98b2435c132c1b5f68e39f5a0b7627cd7ae31f1dc"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux_2_31_armv7l.whl", hash = "sha256:d61f00a0869d77422d9b2aba989e2d24afa6ffd552af442e0e58de4f35ea6d00"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6370e8686f662e6a3941ee48ed4742317cafbe5707e36406e9df792cdb535776"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:a6c5863edfbe888d9eff9c8b8087354e27618d9da76425c119293f11712a6319"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:ed065083d0898c9d5b4bbec7b026fd755ff7454e6e8b73a67f8c744b13986e24"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:2cd4a60d0e2fb04537162c62bbbb4182f53541fe0ede35cdf270a1c1e723cc42"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:813c0e0132266c08eb87469a642cb30aaff57c5f426255419572aaeceeaa7bf4"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:07d9e39b01743c3717745f4c530a6349eadbfa043c7577eef86c502c15df2c67"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:c0f081d69a6e58272819b70288d3221a6ee64b98df852631c80f293514d3b274"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win32.whl", hash = "sha256:8751d2787c9131302398b11e6c8068053dcb55d5a8964e114b6e196cf16cb366"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win_amd64.whl", hash = "sha256:12a6fff75f6bc66711b73a2f0addfc4c8c15a20e805146a02d147a318962c444"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win_arm64.whl", hash = "sha256:bb8cc7534f51d9a017b93e3e85b260924f909601c3df002bcdb58ddb4dc41a5c"}, + {file = "charset_normalizer-3.4.7-py3-none-any.whl", hash = 
"sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d"}, + {file = "charset_normalizer-3.4.7.tar.gz", hash = "sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5"}, +] + +[[package]] +name = "cryptography" +version = "48.0.0" +description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers." +optional = false +python-versions = "!=3.9.0,!=3.9.1,>=3.9" +groups = ["main"] +files = [ + {file = "cryptography-48.0.0-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:0c558d2cdffd8f4bbb30fc7134c74d2ca9a476f830bb053074498fbc86f41ed6"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:f5333311663ea94f75dd408665686aaf426563556bb5283554a3539177e03b8c"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7995ef305d7165c3f11ae07f2517e5a4f1d5c18da1376a0a9ed496336b69e5f3"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:40ba1f85eaa6959837b1d51c9767e230e14612eea4ef110ee8854ada22da1bf5"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:369a6348999f94bbd53435c894377b20ab95f25a9065c283570e70150d8abc3c"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a0e692c683f4df67815a2d258b324e66f4738bd7a96a218c826dce4f4bd05d8f"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_31_armv7l.whl", hash = "sha256:18349bbc56f4743c8b12dc32e2bccb2cf83ee8b69a3bba74ef8ae857e26b3d25"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:7e8eac43dfca5c4cccc6dad9a80504436fca53bb9bc3100a2386d730fbe6b602"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:9ccdac7d40688ecb5a3b4a604b8a88c8002e3442d6c60aead1db2a89a041560c"}, + {file = "cryptography-48.0.0-cp311-abi3-manylinux_2_34_x86_64.whl", hash = 
"sha256:bd72e68b06bb1e96913f97dd4901119bc17f39d4586a5adf2d3e47bc2b9d58b5"}, + {file = "cryptography-48.0.0-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:59baa2cb386c4f0b9905bd6eb4c2a79a69a128408fd31d32ca4d7102d4156321"}, + {file = "cryptography-48.0.0-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:9249e3cd978541d665967ac2cb2787fd6a62bddf1e75b3e347a594d7dacf4f74"}, + {file = "cryptography-48.0.0-cp311-abi3-win32.whl", hash = "sha256:9c459db21422be75e2809370b829a87eb37f74cd785fc4aa9ea1e5f43b47cda4"}, + {file = "cryptography-48.0.0-cp311-abi3-win_amd64.whl", hash = "sha256:5b012212e08b8dd5edc78ef54da83dd9892fd9105323b3993eff6bea65dc21d7"}, + {file = "cryptography-48.0.0-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:3cb07a3ed6431663cd321ea8a000a1314c74211f823e4177fefa2255e057d1ec"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8c7378637d7d88016fa6791c159f698b3d3eed28ebf844ac36b9dc04a14dae18"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:cc90c0b39b2e3c65ef52c804b72e3c58f8a04ab2a1871272798e5f9572c17d20"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:76341972e1eff8b4bea859f09c0d3e64b96ce931b084f9b9b7db8ef364c30eff"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:55b7718303bf06a5753dcdccf2f3945cf18ad7bffde41b61226e4db31ab89a9c"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:a64697c641c7b1b2178e573cbc31c7c6684cd56883a478d75143dbb7118036db"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:561215ea3879cb1cbbf272867e2efda62476f240fb58c64de6b393ae19246741"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:ad64688338ed4bc1a6618076ba75fd7194a5f1797ac60b47afe926285adb3166"}, + {file = 
"cryptography-48.0.0-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:906cbf0670286c6e0044156bc7d4af9cbb0ef6db9f73e52c3ec56ba6bdde5336"}, + {file = "cryptography-48.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:ea8990436d914540a40ab24b6a77c0969695ed52f4a4874c5137ccf7045a7057"}, + {file = "cryptography-48.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:c18684a7f0cc9a3cb60328f496b8e3372def7c5d2df39ac267878b05565aaaae"}, + {file = "cryptography-48.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:9be5aafa5736574f8f15f262adc81b2a9869e2cfe9014d52a44633905b40d52c"}, + {file = "cryptography-48.0.0-cp314-cp314t-win32.whl", hash = "sha256:c17dfe85494deaeddc5ce251aebd1d60bbe6afc8b62071bb0b469431a000124f"}, + {file = "cryptography-48.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:27241b1dc9962e056062a8eef1991d02c3a24569c95975bd2322a8a52c6e5e12"}, + {file = "cryptography-48.0.0-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:58d00498e8933e4a194f3076aee1b4a97dfec1a6da444535755822fe5d8b0b86"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:614d0949f4790582d2cc25553abd09dd723025f0c0e7c67376a1d77196743d6e"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7ce4bfae76319a532a2dc68f82cc32f5676ee792a983187dac07183690e5c66f"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:2eb992bbd4661238c5a397594c83f5b4dc2bc5b848c365c8f991b6780efcc5c7"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:22a5cb272895dce158b2cacdfdc3debd299019659f42947dbdac6f32d68fe832"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:2b4d59804e8408e2fea7d1fbaf218e5ec984325221db76e6a241a9abd6cdd95c"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_31_armv7l.whl", hash = 
"sha256:984a20b0f62a26f48a3396c72e4bc34c66e356d356bf370053066b3b6d54634a"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5a5ed8fde7a1d09376ca0b40e68cd59c69fe23b1f9768bd5824f54681626032a"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:8cd666227ef7af430aa5914a9910e0ddd703e75f039cef0825cd0da71b6b711a"}, + {file = "cryptography-48.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:9071196d81abc88b3516ac8cdfad32e2b66dd4a5393a8e68a961e9161ddc6239"}, + {file = "cryptography-48.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:1e2d54c8be6152856a36f0882ab231e70f8ec7f14e93cf87db8a2ed056bf160c"}, + {file = "cryptography-48.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a5da777e32ffed6f85a7b2b3f7c5cbc88c146bfcd0a1d7baf5fcc6c52ee35dd4"}, + {file = "cryptography-48.0.0-cp39-abi3-win32.whl", hash = "sha256:77a2ccbbe917f6710e05ba9adaa25fb5075620bf3ea6fb751997875aff4ae4bd"}, + {file = "cryptography-48.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:16cd65b9330583e4619939b3a3843eec1e6e789744bb01e7c7e2e62e33c239c8"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:84cf79f0dc8b36ac5da873481716e87aef31fcfa0444f9e1d8b4b2cece142855"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:fdfef35d751d510fcef5252703621574364fec16418c4a1e5e1055248401054b"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:0890f502ddf7d9c6426129c3f49f5c0a39278ed7cd6322c8755ffca6ee675a13"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:ecde28a596bead48b0cfd2a1b4416c3d43074c2d785e3a398d7ec1fc4d0f7fbb"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:4defde8685ae324a9eb9d818717e93b4638ef67070ac9bc15b8ca85f63048355"}, + {file = "cryptography-48.0.0-pp311-pypy311_pp73-win_amd64.whl", hash = 
"sha256:db63bf618e5dea46c07de12e900fe1cdd2541e6dc9dbae772a70b7d4d4765f6a"}, + {file = "cryptography-48.0.0.tar.gz", hash = "sha256:5c3932f4436d1cccb036cb0eaef46e6e2db91035166f1ad6505c3c9d5a635920"}, +] + +[package.dependencies] +cffi = {version = ">=2.0.0", markers = "platform_python_implementation != \"PyPy\""} + +[package.extras] +ssh = ["bcrypt (>=3.1.5)"] + +[[package]] +name = "django" +version = "5.2.14" +description = "A high-level Python web framework that encourages rapid development and clean, pragmatic design." +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "django-5.2.14-py3-none-any.whl", hash = "sha256:6f712143bd3064310d1f50fac859c3e9a274bdcfc9595339853be7779297fc76"}, + {file = "django-5.2.14.tar.gz", hash = "sha256:58a63ba841662e5c686b57ba1fec52ddd68c0b93bd96ac3029d55728f00bf8a2"}, +] + +[package.dependencies] +asgiref = ">=3.8.1" +sqlparse = ">=0.3.1" +tzdata = {version = "*", markers = "sys_platform == \"win32\""} + +[package.extras] +argon2 = ["argon2-cffi (>=19.1.0)"] +bcrypt = ["bcrypt"] + +[[package]] +name = "django-extensions" +version = "3.2.3" +description = "Extensions for Django" +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "django-extensions-3.2.3.tar.gz", hash = "sha256:44d27919d04e23b3f40231c4ab7af4e61ce832ef46d610cc650d53e68328410a"}, + {file = "django_extensions-3.2.3-py3-none-any.whl", hash = "sha256:9600b7562f79a92cbf1fde6403c04fee314608fefbb595502e34383ae8203401"}, +] + +[package.dependencies] +Django = ">=3.2" + +[[package]] +name = "djangorestframework" +version = "3.17.1" +description = "Web APIs for Django, made easy." 
+optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "djangorestframework-3.17.1-py3-none-any.whl", hash = "sha256:c3c74dd3e83a5a3efc37b3c18d92bd6f86a6791c7b7d4dff62bb068500e76457"}, + {file = "djangorestframework-3.17.1.tar.gz", hash = "sha256:a6def5f447fe78ff853bff1d47a3c59bf38f5434b031780b351b0c73a62db1a5"}, +] + +[package.dependencies] +django = ">=4.2" + +[[package]] +name = "durationpy" +version = "0.10" +description = "Module for converting between datetime.timedelta and Go's Duration strings." +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "durationpy-0.10-py3-none-any.whl", hash = "sha256:3b41e1b601234296b4fb368338fdcd3e13e0b4fb5b67345948f4f2bf9868b286"}, + {file = "durationpy-0.10.tar.gz", hash = "sha256:1fa6893409a6e739c9c72334fc65cca1f355dbdd93405d30f726deb5bde42fba"}, +] + +[[package]] +name = "gitdb" +version = "4.0.12" +description = "Git Object Database" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "gitdb-4.0.12-py3-none-any.whl", hash = "sha256:67073e15955400952c6565cc3e707c554a4eea2e428946f7a4c162fab9bd9bcf"}, + {file = "gitdb-4.0.12.tar.gz", hash = "sha256:5ef71f855d191a3326fcfbc0d5da835f26b13fbcba60c32c21091c349ffdb571"}, +] + +[package.dependencies] +smmap = ">=3.0.1,<6" + +[[package]] +name = "gitpython" +version = "3.1.49" +description = "GitPython is a Python library used to interact with Git repositories" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "gitpython-3.1.49-py3-none-any.whl", hash = "sha256:024b0422d7f84d15cd794844e029ffebd4c5d42a7eb9b936b458697ef550a02c"}, + {file = "gitpython-3.1.49.tar.gz", hash = "sha256:42f9399c9eb33fc581014bedd76049dfbaf6375aa2a5754575966387280315e1"}, +] + +[package.dependencies] +gitdb = ">=4.0.1,<5" + +[package.extras] +doc = ["sphinx (>=7.4.7,<8)", "sphinx-autodoc-typehints", "sphinx_rtd_theme"] +test = ["coverage[toml]", "ddt (>=1.1.1,!=1.4.3)", "mock ; 
python_version < \"3.8\"", "mypy (==1.18.2) ; python_version >= \"3.9\"", "pre-commit", "pytest (>=7.3.1)", "pytest-cov", "pytest-instafail", "pytest-mock", "pytest-sugar", "typing-extensions ; python_version < \"3.11\""] + +[[package]] +name = "google-auth" +version = "2.50.0" +description = "Google Authentication Library" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "google_auth-2.50.0-py3-none-any.whl", hash = "sha256:04382175e28b94f49694977f0a792688b59a668def1499e9d8de996dc9ce5b15"}, + {file = "google_auth-2.50.0.tar.gz", hash = "sha256:f35eafb191195328e8ce10a7883970877e7aeb49c2bfaa54aa0e394316d353d0"}, +] + +[package.dependencies] +cryptography = ">=38.0.3" +pyasn1-modules = ">=0.2.1" + +[package.extras] +aiohttp = ["aiohttp (>=3.6.2,<4.0.0)", "requests (>=2.20.0,<3.0.0)"] +cryptography = ["cryptography (>=38.0.3)"] +enterprise-cert = ["pyopenssl"] +pyjwt = ["pyjwt (>=2.0)"] +pyopenssl = ["pyopenssl (>=20.0.0)"] +reauth = ["pyu2f (>=0.1.5)"] +requests = ["requests (>=2.20.0,<3.0.0)"] +rsa = ["rsa (>=3.1.4,<5)"] +testing = ["aiohttp (<3.10.0)", "aiohttp (>=3.6.2,<4.0.0)", "aioresponses", "flask", "freezegun", "grpcio", "packaging", "pyjwt (>=2.0)", "pyopenssl (<24.3.0)", "pyopenssl (>=20.0.0)", "pytest", "pytest-asyncio", "pytest-cov", "pytest-localserver", "pyu2f (>=0.1.5)", "requests (>=2.20.0,<3.0.0)", "responses", "urllib3"] +urllib3 = ["packaging", "urllib3"] + +[[package]] +name = "idna" +version = "3.13" +description = "Internationalized Domain Names in Applications (IDNA)" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "idna-3.13-py3-none-any.whl", hash = "sha256:892ea0cde124a99ce773decba204c5552b69c3c67ffd5f232eb7696135bc8bb3"}, + {file = "idna-3.13.tar.gz", hash = "sha256:585ea8fe5d69b9181ec1afba340451fba6ba764af97026f92a91d4eef164a242"}, +] + +[package.extras] +all = ["mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] + +[[package]] +name = "isodate" +version = 
"0.7.2" +description = "An ISO 8601 date/time/duration parser and formatter" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "isodate-0.7.2-py3-none-any.whl", hash = "sha256:28009937d8031054830160fce6d409ed342816b543597cece116d966c6d99e15"}, + {file = "isodate-0.7.2.tar.gz", hash = "sha256:4cd1aa0f43ca76f4a6c6c0292a85f40b35ec2e43e315b59f06e6d32171a953e6"}, +] + +[[package]] +name = "jinja2" +version = "3.1.6" +description = "A very fast and expressive template engine." +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"}, + {file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"}, +] + +[package.dependencies] +MarkupSafe = ">=2.0" + +[package.extras] +i18n = ["Babel (>=2.7)"] + +[[package]] +name = "jmespath" +version = "1.1.0" +description = "JSON Matching Expressions" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "jmespath-1.1.0-py3-none-any.whl", hash = "sha256:a5663118de4908c91729bea0acadca56526eb2698e83de10cd116ae0f4e97c64"}, + {file = "jmespath-1.1.0.tar.gz", hash = "sha256:472c87d80f36026ae83c6ddd0f1d05d4e510134ed462851fd5f754c8c3cbb88d"}, +] + +[[package]] +name = "jsonschema" +version = "4.26.0" +description = "An implementation of JSON Schema validation for Python" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "jsonschema-4.26.0-py3-none-any.whl", hash = "sha256:d489f15263b8d200f8387e64b4c3a75f06629559fb73deb8fdfb525f2dab50ce"}, + {file = "jsonschema-4.26.0.tar.gz", hash = "sha256:0c26707e2efad8aa1bfc5b7ce170f3fccc2e4918ff85989ba9ffa9facb2be326"}, +] + +[package.dependencies] +attrs = ">=22.2.0" +jsonschema-specifications = ">=2023.3.6" +referencing = ">=0.28.4" +rpds-py = ">=0.25.0" + +[package.extras] +format = ["fqdn", "idna", 
"isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"] +format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "rfc3987-syntax (>=1.1.0)", "uri-template", "webcolors (>=24.6.0)"] + +[[package]] +name = "jsonschema-specifications" +version = "2025.9.1" +description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "jsonschema_specifications-2025.9.1-py3-none-any.whl", hash = "sha256:98802fee3a11ee76ecaca44429fda8a41bff98b00a0f2838151b113f210cc6fe"}, + {file = "jsonschema_specifications-2025.9.1.tar.gz", hash = "sha256:b540987f239e745613c7a9176f3edb72b832a4ac465cf02712288397832b5e8d"}, +] + +[package.dependencies] +referencing = ">=0.31.0" + +[[package]] +name = "kubernetes" +version = "34.1.0" +description = "Kubernetes python client" +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "kubernetes-34.1.0-py2.py3-none-any.whl", hash = "sha256:bffba2272534e224e6a7a74d582deb0b545b7c9879d2cd9e4aae9481d1f2cc2a"}, + {file = "kubernetes-34.1.0.tar.gz", hash = "sha256:8fe8edb0b5d290a2f3ac06596b23f87c658977d46b5f8df9d0f4ea83d0003912"}, +] + +[package.dependencies] +certifi = ">=14.5.14" +durationpy = ">=0.7" +google-auth = ">=1.0.1" +python-dateutil = ">=2.5.3" +pyyaml = ">=5.4.1" +requests = "*" +requests-oauthlib = "*" +six = ">=1.9.0" +urllib3 = ">=1.24.2,<2.4.0" +websocket-client = ">=0.32.0,<0.40.0 || >0.40.0,<0.41.dev0 || >=0.43.dev0" + +[package.extras] +adal = ["adal (>=1.0.2)"] + +[[package]] +name = "markupsafe" +version = "3.0.3" +description = "Safely add untrusted strings to HTML/XML markup." 
+optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "markupsafe-3.0.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559"}, + {file = "markupsafe-3.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591"}, + {file = "markupsafe-3.0.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6"}, + {file = "markupsafe-3.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1"}, + {file = "markupsafe-3.0.3-cp310-cp310-win32.whl", hash = "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa"}, + {file = "markupsafe-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8"}, + {file = "markupsafe-3.0.3-cp310-cp310-win_arm64.whl", hash = "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1"}, + {file = "markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad"}, + {file = 
"markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf"}, + {file = "markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115"}, + {file = "markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a"}, + {file = "markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19"}, + {file = "markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01"}, + {file = "markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c"}, + {file = "markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e"}, + {file = "markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d"}, + {file = "markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f"}, + {file = "markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b"}, + {file = "markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d"}, + {file = "markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c"}, + {file = "markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f"}, + {file = "markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795"}, + {file = "markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219"}, + {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6"}, + {file = "markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676"}, + 
{file = "markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc"}, + {file = "markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12"}, + {file = "markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed"}, + {file = "markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5"}, + {file = "markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485"}, + {file = "markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73"}, + {file = "markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025"}, + {file = "markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb"}, + {file = "markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218"}, + {file = "markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287"}, + {file = "markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe"}, + {file = "markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97"}, + {file = "markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d"}, + {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda"}, + {file = "markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf"}, + {file = 
"markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe"}, + {file = "markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9"}, + {file = "markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581"}, + {file = "markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4"}, + {file = "markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab"}, + {file = "markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50"}, + {file = "markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523"}, + {file = "markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = 
"sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9"}, + {file = "markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa"}, + {file = "markupsafe-3.0.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:15d939a21d546304880945ca1ecb8a039db6b4dc49b2c5a400387cdae6a62e26"}, + {file = "markupsafe-3.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f71a396b3bf33ecaa1626c255855702aca4d3d9fea5e051b41ac59a9c1c41edc"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0f4b68347f8c5eab4a13419215bdfd7f8c9b19f2b25520968adfad23eb0ce60c"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e8fc20152abba6b83724d7ff268c249fa196d8259ff481f3b1476383f8f24e42"}, + {file = "markupsafe-3.0.3-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:949b8d66bc381ee8b007cd945914c721d9aba8e27f71959d750a46f7c282b20b"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:3537e01efc9d4dccdf77221fb1cb3b8e1a38d5428920e0657ce299b20324d758"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:591ae9f2a647529ca990bc681daebdd52c8791ff06c2bfa05b65163e28102ef2"}, + {file = "markupsafe-3.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:a320721ab5a1aba0a233739394eb907f8c8da5c98c9181d1161e77a0c8e36f2d"}, + {file = "markupsafe-3.0.3-cp39-cp39-win32.whl", hash = "sha256:df2449253ef108a379b8b5d6b43f4b1a8e81a061d6537becd5582fba5f9196d7"}, + {file = "markupsafe-3.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:7c3fb7d25180895632e5d3148dbdc29ea38ccb7fd210aa27acbd1201a1902c6e"}, + {file = "markupsafe-3.0.3-cp39-cp39-win_arm64.whl", hash = 
"sha256:38664109c14ffc9e7437e86b4dceb442b0096dfe3541d7864d9cbe1da4cf36c8"}, + {file = "markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698"}, +] + +[[package]] +name = "msal" +version = "1.36.0" +description = "The Microsoft Authentication Library (MSAL) for Python library enables your app to access the Microsoft Cloud by supporting authentication of users with Microsoft Azure Active Directory accounts (AAD) and Microsoft Accounts (MSA) using industry standard OAuth2 and OpenID Connect." +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "msal-1.36.0-py3-none-any.whl", hash = "sha256:36ecac30e2ff4322d956029aabce3c82301c29f0acb1ad89b94edcabb0e58ec4"}, + {file = "msal-1.36.0.tar.gz", hash = "sha256:3f6a4af2b036b476a4215111c4297b4e6e236ed186cd804faefba23e4990978b"}, +] + +[package.dependencies] +cryptography = ">=2.5,<49" +PyJWT = {version = ">=1.0.0,<3", extras = ["crypto"]} +requests = ">=2.0.0,<3" + +[package.extras] +broker = ["pymsalruntime (>=0.14,<0.21) ; python_version >= \"3.8\" and platform_system == \"Windows\"", "pymsalruntime (>=0.17,<0.21) ; python_version >= \"3.8\" and platform_system == \"Darwin\"", "pymsalruntime (>=0.18,<0.21) ; python_version >= \"3.8\" and platform_system == \"Linux\""] + +[[package]] +name = "msal-extensions" +version = "1.3.1" +description = "Microsoft Authentication Library extensions (MSAL EX) provides a persistence API that can save your data on disk, encrypted on Windows, macOS and Linux. Concurrent data access will be coordinated by a file lock mechanism." 
+optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "msal_extensions-1.3.1-py3-none-any.whl", hash = "sha256:96d3de4d034504e969ac5e85bae8106c8373b5c6568e4c8fa7af2eca9dbe6bca"}, + {file = "msal_extensions-1.3.1.tar.gz", hash = "sha256:c5b0fd10f65ef62b5f1d62f4251d51cbcaf003fcedae8c91b040a488614be1a4"}, +] + +[package.dependencies] +msal = ">=1.29,<2" + +[package.extras] +portalocker = ["portalocker (>=1.4,<4)"] + +[[package]] +name = "msrest" +version = "0.7.1" +description = "AutoRest swagger generator Python client runtime." +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "msrest-0.7.1-py3-none-any.whl", hash = "sha256:21120a810e1233e5e6cc7fe40b474eeb4ec6f757a15d7cf86702c369f9567c32"}, + {file = "msrest-0.7.1.zip", hash = "sha256:6e7661f46f3afd88b75667b7187a92829924446c7ea1d169be8c4bb7eeb788b9"}, +] + +[package.dependencies] +azure-core = ">=1.24.0" +certifi = ">=2017.4.17" +isodate = ">=0.6.0" +requests = ">=2.16,<3.0" +requests-oauthlib = ">=0.5.0" + +[package.extras] +async = ["aiodns ; python_version >= \"3.5\"", "aiohttp (>=3.0) ; python_version >= \"3.5\""] + +[[package]] +name = "oauthlib" +version = "3.3.1" +description = "A generic, spec-compliant, thorough implementation of the OAuth request-signing logic" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "oauthlib-3.3.1-py3-none-any.whl", hash = "sha256:88119c938d2b8fb88561af5f6ee0eec8cc8d552b7bb1f712743136eb7523b7a1"}, + {file = "oauthlib-3.3.1.tar.gz", hash = "sha256:0f0f8aa759826a193cf66c12ea1af1637f87b9b4622d46e866952bb022e538c9"}, +] + +[package.extras] +rsa = ["cryptography (>=3.0.0)"] +signals = ["blinker (>=1.4.0)"] +signedtoken = ["cryptography (>=3.0.0)", "pyjwt (>=2.0.0,<3)"] + +[[package]] +name = "pyasn1" +version = "0.6.3" +description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + 
{file = "pyasn1-0.6.3-py3-none-any.whl", hash = "sha256:a80184d120f0864a52a073acc6fc642847d0be408e7c7252f31390c0f4eadcde"}, + {file = "pyasn1-0.6.3.tar.gz", hash = "sha256:697a8ecd6d98891189184ca1fa05d1bb00e2f84b5977c481452050549c8a72cf"}, +] + +[[package]] +name = "pyasn1-modules" +version = "0.4.2" +description = "A collection of ASN.1-based protocols modules" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "pyasn1_modules-0.4.2-py3-none-any.whl", hash = "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a"}, + {file = "pyasn1_modules-0.4.2.tar.gz", hash = "sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6"}, +] + +[package.dependencies] +pyasn1 = ">=0.6.1,<0.7.0" + +[[package]] +name = "pycparser" +version = "3.0" +description = "C parser in Python" +optional = false +python-versions = ">=3.10" +groups = ["main"] +markers = "platform_python_implementation != \"PyPy\" and implementation_name != \"PyPy\"" +files = [ + {file = "pycparser-3.0-py3-none-any.whl", hash = "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992"}, + {file = "pycparser-3.0.tar.gz", hash = "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29"}, +] + +[[package]] +name = "pyjwt" +version = "2.12.1" +description = "JSON Web Token implementation in Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "pyjwt-2.12.1-py3-none-any.whl", hash = "sha256:28ca37c070cad8ba8cd9790cd940535d40274d22f80ab87f3ac6a713e6e8454c"}, + {file = "pyjwt-2.12.1.tar.gz", hash = "sha256:c74a7a2adf861c04d002db713dd85f84beb242228e671280bf709d765b03672b"}, +] + +[package.dependencies] +cryptography = {version = ">=3.4.0", optional = true, markers = "extra == \"crypto\""} + +[package.extras] +crypto = ["cryptography (>=3.4.0)"] +dev = ["coverage[toml] (==7.10.7)", "cryptography (>=3.4.0)", "pre-commit", "pytest (>=8.4.2,<9.0.0)", "sphinx", "sphinx-rtd-theme", 
"zope.interface"] +docs = ["sphinx", "sphinx-rtd-theme", "zope.interface"] +tests = ["coverage[toml] (==7.10.7)", "pytest (>=8.4.2,<9.0.0)"] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main"] +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + +[[package]] +name = "pyyaml" +version = "6.0.3" +description = "YAML parser and emitter for Python" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "PyYAML-6.0.3-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:c2514fceb77bc5e7a2f7adfaa1feb2fb311607c9cb518dbc378688ec73d8292f"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9c57bb8c96f6d1808c030b1687b9b5fb476abaa47f0db9c0101f5e9f394e97f4"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efd7b85f94a6f21e4932043973a7ba2613b059c4a000551892ac9f1d11f5baf3"}, + {file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:22ba7cfcad58ef3ecddc7ed1db3409af68d023b7f940da23c6c2a1890976eda6"}, + {file = "PyYAML-6.0.3-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6344df0d5755a2c9a276d4473ae6b90647e216ab4757f8426893b5dd2ac3f369"}, + {file = "PyYAML-6.0.3-cp38-cp38-win32.whl", hash = "sha256:3ff07ec89bae51176c0549bc4c63aa6202991da2d9a6129d7aef7f1407d3f295"}, + {file = "PyYAML-6.0.3-cp38-cp38-win_amd64.whl", hash = "sha256:5cf4e27da7e3fbed4d6c3d8e797387aaad68102272f8f9752883bc32d61cb87b"}, + {file = 
"pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b"}, + {file = "pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198"}, + {file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b"}, + {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0"}, + {file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69"}, + {file = "pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e"}, + {file = "pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c"}, + {file = "pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e"}, + {file = "pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00"}, + {file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d"}, + {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a"}, + {file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4"}, + {file = "pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b"}, + {file = "pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf"}, + {file = "pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196"}, + {file = "pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c"}, + {file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc"}, + {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e"}, + {file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea"}, + {file = 
"pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5"}, + {file = "pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b"}, + {file = "pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd"}, + {file = "pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8"}, + {file = "pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5"}, + {file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6"}, + {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6"}, + {file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be"}, + {file = "pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26"}, + {file = "pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c"}, + {file = "pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb"}, + {file = 
"pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac"}, + {file = "pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788"}, + {file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5"}, + {file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764"}, + {file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35"}, + {file = "pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac"}, + {file = "pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3"}, + {file = "pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3"}, + {file = "pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", 
hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702"}, + {file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c"}, + {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065"}, + {file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65"}, + {file = "pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9"}, + {file = "pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b"}, + {file = "pyyaml-6.0.3-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:b865addae83924361678b652338317d1bd7e79b1f4596f96b96c77a5a34b34da"}, + {file = "pyyaml-6.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c3355370a2c156cffb25e876646f149d5d68f5e0a3ce86a5084dd0b64a994917"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3c5677e12444c15717b902a5798264fa7909e41153cdf9ef7ad571b704a63dd9"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5ed875a24292240029e4483f9d4a4b8a1ae08843b9c54f43fcc11e404532a8a5"}, + {file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0150219816b6a1fa26fb4699fb7daa9caf09eb1999f3b70fb6e786805e80375a"}, + {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:fa160448684b4e94d80416c0fa4aac48967a969efe22931448d853ada8baf926"}, + {file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:27c0abcb4a5dac13684a37f76e701e054692a9b2d3064b70f5e4eb54810553d7"}, + {file = 
"pyyaml-6.0.3-cp39-cp39-win32.whl", hash = "sha256:1ebe39cb5fc479422b83de611d14e2c0d3bb2a18bbcb01f229ab3cfbd8fee7a0"}, + {file = "pyyaml-6.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:2e71d11abed7344e42a8849600193d15b6def118602c4c176f748e4583246007"}, + {file = "pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f"}, +] + +[[package]] +name = "referencing" +version = "0.37.0" +description = "JSON Referencing + Python" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "referencing-0.37.0-py3-none-any.whl", hash = "sha256:381329a9f99628c9069361716891d34ad94af76e461dcb0335825aecc7692231"}, + {file = "referencing-0.37.0.tar.gz", hash = "sha256:44aefc3142c5b842538163acb373e24cce6632bd54bdb01b21ad5863489f50d8"}, +] + +[package.dependencies] +attrs = ">=22.2.0" +rpds-py = ">=0.7.0" + +[[package]] +name = "requests" +version = "2.33.1" +description = "Python HTTP for Humans." +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a"}, + {file = "requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517"}, +] + +[package.dependencies] +certifi = ">=2023.5.7" +charset_normalizer = ">=2,<4" +idna = ">=2.5,<4" +urllib3 = ">=1.26,<3" + +[package.extras] +socks = ["PySocks (>=1.5.6,!=1.5.7)"] +use-chardet-on-py3 = ["chardet (>=3.0.2,<8)"] + +[[package]] +name = "requests-oauthlib" +version = "2.0.0" +description = "OAuthlib authentication support for Requests." 
+optional = false +python-versions = ">=3.4" +groups = ["main"] +files = [ + {file = "requests-oauthlib-2.0.0.tar.gz", hash = "sha256:b3dffaebd884d8cd778494369603a9e7b58d29111bf6b41bdc2dcd87203af4e9"}, + {file = "requests_oauthlib-2.0.0-py2.py3-none-any.whl", hash = "sha256:7dd8a5c40426b779b0868c404bdef9768deccf22749cde15852df527e6269b36"}, +] + +[package.dependencies] +oauthlib = ">=3.0.0" +requests = ">=2.0.0" + +[package.extras] +rsa = ["oauthlib[signedtoken] (>=3.0.0)"] + +[[package]] +name = "robotframework" +version = "7.3.2" +description = "Generic automation framework for acceptance testing and robotic process automation (RPA)" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "robotframework-7.3.2-py3-none-any.whl", hash = "sha256:14ef2afa905285cc073df6ce06d0cd3af4a113df6f815532718079e00c98cca4"}, + {file = "robotframework-7.3.2.tar.gz", hash = "sha256:3bb3e299831ecb1664f3d5082f6ff9f08ba82d61a745bef2227328ef3049e93a"}, +] + +[[package]] +name = "rpds-py" +version = "0.30.0" +description = "Python bindings to Rust's persistent data structures (rpds)" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "rpds_py-0.30.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:679ae98e00c0e8d68a7fda324e16b90fd5260945b45d3b824c892cec9eea3288"}, + {file = "rpds_py-0.30.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4cc2206b76b4f576934f0ed374b10d7ca5f457858b157ca52064bdfc26b9fc00"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:389a2d49eded1896c3d48b0136ead37c48e221b391c052fba3f4055c367f60a6"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:32c8528634e1bf7121f3de08fa85b138f4e0dc47657866630611b03967f041d7"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f207f69853edd6f6700b86efb84999651baf3789e78a466431df1331608e5324"}, + {file 
= "rpds_py-0.30.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:67b02ec25ba7a9e8fa74c63b6ca44cf5707f2fbfadae3ee8e7494297d56aa9df"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c0e95f6819a19965ff420f65578bacb0b00f251fefe2c8b23347c37174271f3"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_31_riscv64.whl", hash = "sha256:a452763cc5198f2f98898eb98f7569649fe5da666c2dc6b5ddb10fde5a574221"}, + {file = "rpds_py-0.30.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e0b65193a413ccc930671c55153a03ee57cecb49e6227204b04fae512eb657a7"}, + {file = "rpds_py-0.30.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:858738e9c32147f78b3ac24dc0edb6610000e56dc0f700fd5f651d0a0f0eb9ff"}, + {file = "rpds_py-0.30.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:da279aa314f00acbb803da1e76fa18666778e8a8f83484fba94526da5de2cba7"}, + {file = "rpds_py-0.30.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:7c64d38fb49b6cdeda16ab49e35fe0da2e1e9b34bc38bd78386530f218b37139"}, + {file = "rpds_py-0.30.0-cp310-cp310-win32.whl", hash = "sha256:6de2a32a1665b93233cde140ff8b3467bdb9e2af2b91079f0333a0974d12d464"}, + {file = "rpds_py-0.30.0-cp310-cp310-win_amd64.whl", hash = "sha256:1726859cd0de969f88dc8673bdd954185b9104e05806be64bcd87badbe313169"}, + {file = "rpds_py-0.30.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a2bffea6a4ca9f01b3f8e548302470306689684e61602aa3d141e34da06cf425"}, + {file = "rpds_py-0.30.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dc4f992dfe1e2bc3ebc7444f6c7051b4bc13cd8e33e43511e8ffd13bf407010d"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:422c3cb9856d80b09d30d2eb255d0754b23e090034e1deb4083f8004bd0761e4"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:07ae8a593e1c3c6b82ca3292efbe73c30b61332fd612e05abee07c79359f292f"}, + {file = 
"rpds_py-0.30.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:12f90dd7557b6bd57f40abe7747e81e0c0b119bef015ea7726e69fe550e394a4"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:99b47d6ad9a6da00bec6aabe5a6279ecd3c06a329d4aa4771034a21e335c3a97"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:33f559f3104504506a44bb666b93a33f5d33133765b0c216a5bf2f1e1503af89"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:946fe926af6e44f3697abbc305ea168c2c31d3e3ef1058cf68f379bf0335a78d"}, + {file = "rpds_py-0.30.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:495aeca4b93d465efde585977365187149e75383ad2684f81519f504f5c13038"}, + {file = "rpds_py-0.30.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9a0ca5da0386dee0655b4ccdf46119df60e0f10da268d04fe7cc87886872ba7"}, + {file = "rpds_py-0.30.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:8d6d1cc13664ec13c1b84241204ff3b12f9bb82464b8ad6e7a5d3486975c2eed"}, + {file = "rpds_py-0.30.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:3896fa1be39912cf0757753826bc8bdc8ca331a28a7c4ae46b7a21280b06bb85"}, + {file = "rpds_py-0.30.0-cp311-cp311-win32.whl", hash = "sha256:55f66022632205940f1827effeff17c4fa7ae1953d2b74a8581baaefb7d16f8c"}, + {file = "rpds_py-0.30.0-cp311-cp311-win_amd64.whl", hash = "sha256:a51033ff701fca756439d641c0ad09a41d9242fa69121c7d8769604a0a629825"}, + {file = "rpds_py-0.30.0-cp311-cp311-win_arm64.whl", hash = "sha256:47b0ef6231c58f506ef0b74d44e330405caa8428e770fec25329ed2cb971a229"}, + {file = "rpds_py-0.30.0-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a161f20d9a43006833cd7068375a94d035714d73a172b681d8881820600abfad"}, + {file = "rpds_py-0.30.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6abc8880d9d036ecaafe709079969f56e876fcf107f7a8e9920ba6d5a3878d05"}, + {file = 
"rpds_py-0.30.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca28829ae5f5d569bb62a79512c842a03a12576375d5ece7d2cadf8abe96ec28"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a1010ed9524c73b94d15919ca4d41d8780980e1765babf85f9a2f90d247153dd"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f8d1736cfb49381ba528cd5baa46f82fdc65c06e843dab24dd70b63d09121b3f"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d948b135c4693daff7bc2dcfc4ec57237a29bd37e60c2fabf5aff2bbacf3e2f1"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47f236970bccb2233267d89173d3ad2703cd36a0e2a6e92d0560d333871a3d23"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:2e6ecb5a5bcacf59c3f912155044479af1d0b6681280048b338b28e364aca1f6"}, + {file = "rpds_py-0.30.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a8fa71a2e078c527c3e9dc9fc5a98c9db40bcc8a92b4e8858e36d329f8684b51"}, + {file = "rpds_py-0.30.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:73c67f2db7bc334e518d097c6d1e6fed021bbc9b7d678d6cc433478365d1d5f5"}, + {file = "rpds_py-0.30.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:5ba103fb455be00f3b1c2076c9d4264bfcb037c976167a6047ed82f23153f02e"}, + {file = "rpds_py-0.30.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7cee9c752c0364588353e627da8a7e808a66873672bcb5f52890c33fd965b394"}, + {file = "rpds_py-0.30.0-cp312-cp312-win32.whl", hash = "sha256:1ab5b83dbcf55acc8b08fc62b796ef672c457b17dbd7820a11d6c52c06839bdf"}, + {file = "rpds_py-0.30.0-cp312-cp312-win_amd64.whl", hash = "sha256:a090322ca841abd453d43456ac34db46e8b05fd9b3b4ac0c78bcde8b089f959b"}, + {file = "rpds_py-0.30.0-cp312-cp312-win_arm64.whl", hash = 
"sha256:669b1805bd639dd2989b281be2cfd951c6121b65e729d9b843e9639ef1fd555e"}, + {file = "rpds_py-0.30.0-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:f83424d738204d9770830d35290ff3273fbb02b41f919870479fab14b9d303b2"}, + {file = "rpds_py-0.30.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e7536cd91353c5273434b4e003cbda89034d67e7710eab8761fd918ec6c69cf8"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2771c6c15973347f50fece41fc447c054b7ac2ae0502388ce3b6738cd366e3d4"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0a59119fc6e3f460315fe9d08149f8102aa322299deaa5cab5b40092345c2136"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:76fec018282b4ead0364022e3c54b60bf368b9d926877957a8624b58419169b7"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:692bef75a5525db97318e8cd061542b5a79812d711ea03dbc1f6f8dbb0c5f0d2"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9027da1ce107104c50c81383cae773ef5c24d296dd11c99e2629dbd7967a20c6"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:9cf69cdda1f5968a30a359aba2f7f9aa648a9ce4b580d6826437f2b291cfc86e"}, + {file = "rpds_py-0.30.0-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a4796a717bf12b9da9d3ad002519a86063dcac8988b030e405704ef7d74d2d9d"}, + {file = "rpds_py-0.30.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5d4c2aa7c50ad4728a094ebd5eb46c452e9cb7edbfdb18f9e1221f597a73e1e7"}, + {file = "rpds_py-0.30.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ba81a9203d07805435eb06f536d95a266c21e5b2dfbf6517748ca40c98d19e31"}, + {file = "rpds_py-0.30.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:945dccface01af02675628334f7cf49c2af4c1c904748efc5cf7bbdf0b579f95"}, + {file = 
"rpds_py-0.30.0-cp313-cp313-win32.whl", hash = "sha256:b40fb160a2db369a194cb27943582b38f79fc4887291417685f3ad693c5a1d5d"}, + {file = "rpds_py-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:806f36b1b605e2d6a72716f321f20036b9489d29c51c91f4dd29a3e3afb73b15"}, + {file = "rpds_py-0.30.0-cp313-cp313-win_arm64.whl", hash = "sha256:d96c2086587c7c30d44f31f42eae4eac89b60dabbac18c7669be3700f13c3ce1"}, + {file = "rpds_py-0.30.0-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:eb0b93f2e5c2189ee831ee43f156ed34e2a89a78a66b98cadad955972548be5a"}, + {file = "rpds_py-0.30.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:922e10f31f303c7c920da8981051ff6d8c1a56207dbdf330d9047f6d30b70e5e"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cdc62c8286ba9bf7f47befdcea13ea0e26bf294bda99758fd90535cbaf408000"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:47f9a91efc418b54fb8190a6b4aa7813a23fb79c51f4bb84e418f5476c38b8db"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1f3587eb9b17f3789ad50824084fa6f81921bbf9a795826570bda82cb3ed91f2"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:39c02563fc592411c2c61d26b6c5fe1e51eaa44a75aa2c8735ca88b0d9599daa"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:51a1234d8febafdfd33a42d97da7a43f5dcb120c1060e352a3fbc0c6d36e2083"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_31_riscv64.whl", hash = "sha256:eb2c4071ab598733724c08221091e8d80e89064cd472819285a9ab0f24bcedb9"}, + {file = "rpds_py-0.30.0-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6bdfdb946967d816e6adf9a3d8201bfad269c67efe6cefd7093ef959683c8de0"}, + {file = "rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:c77afbd5f5250bf27bf516c7c4a016813eb2d3e116139aed0096940c5982da94"}, + {file = "rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:61046904275472a76c8c90c9ccee9013d70a6d0f73eecefd38c1ae7c39045a08"}, + {file = "rpds_py-0.30.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c5f36a861bc4b7da6516dbdf302c55313afa09b81931e8280361a4f6c9a2d27"}, + {file = "rpds_py-0.30.0-cp313-cp313t-win32.whl", hash = "sha256:3d4a69de7a3e50ffc214ae16d79d8fbb0922972da0356dcf4d0fdca2878559c6"}, + {file = "rpds_py-0.30.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f14fc5df50a716f7ece6a80b6c78bb35ea2ca47c499e422aa4463455dd96d56d"}, + {file = "rpds_py-0.30.0-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:68f19c879420aa08f61203801423f6cd5ac5f0ac4ac82a2368a9fcd6a9a075e0"}, + {file = "rpds_py-0.30.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ec7c4490c672c1a0389d319b3a9cfcd098dcdc4783991553c332a15acf7249be"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f251c812357a3fed308d684a5079ddfb9d933860fc6de89f2b7ab00da481e65f"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac98b175585ecf4c0348fd7b29c3864bda53b805c773cbf7bfdaffc8070c976f"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3e62880792319dbeb7eb866547f2e35973289e7d5696c6e295476448f5b63c87"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4e7fc54e0900ab35d041b0601431b0a0eb495f0851a0639b6ef90f7741b39a18"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47e77dc9822d3ad616c3d5759ea5631a75e5809d5a28707744ef79d7a1bcfcad"}, + {file = "rpds_py-0.30.0-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:b4dc1a6ff022ff85ecafef7979a2c6eb423430e05f1165d6688234e62ba99a07"}, + {file = 
"rpds_py-0.30.0-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4559c972db3a360808309e06a74628b95eaccbf961c335c8fe0d590cf587456f"}, + {file = "rpds_py-0.30.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0ed177ed9bded28f8deb6ab40c183cd1192aa0de40c12f38be4d59cd33cb5c65"}, + {file = "rpds_py-0.30.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:ad1fa8db769b76ea911cb4e10f049d80bf518c104f15b3edb2371cc65375c46f"}, + {file = "rpds_py-0.30.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:46e83c697b1f1c72b50e5ee5adb4353eef7406fb3f2043d64c33f20ad1c2fc53"}, + {file = "rpds_py-0.30.0-cp314-cp314-win32.whl", hash = "sha256:ee454b2a007d57363c2dfd5b6ca4a5d7e2c518938f8ed3b706e37e5d470801ed"}, + {file = "rpds_py-0.30.0-cp314-cp314-win_amd64.whl", hash = "sha256:95f0802447ac2d10bcc69f6dc28fe95fdf17940367b21d34e34c737870758950"}, + {file = "rpds_py-0.30.0-cp314-cp314-win_arm64.whl", hash = "sha256:613aa4771c99f03346e54c3f038e4cc574ac09a3ddfb0e8878487335e96dead6"}, + {file = "rpds_py-0.30.0-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:7e6ecfcb62edfd632e56983964e6884851786443739dbfe3582947e87274f7cb"}, + {file = "rpds_py-0.30.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:a1d0bc22a7cdc173fedebb73ef81e07faef93692b8c1ad3733b67e31e1b6e1b8"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0d08f00679177226c4cb8c5265012eea897c8ca3b93f429e546600c971bcbae7"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5965af57d5848192c13534f90f9dd16464f3c37aaf166cc1da1cae1fd5a34898"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9a4e86e34e9ab6b667c27f3211ca48f73dba7cd3d90f8d5b11be56e5dbc3fb4e"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e5d3e6b26f2c785d65cc25ef1e5267ccbe1b069c5c21b8cc724efee290554419"}, + {file = 
"rpds_py-0.30.0-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:626a7433c34566535b6e56a1b39a7b17ba961e97ce3b80ec62e6f1312c025551"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:acd7eb3f4471577b9b5a41baf02a978e8bdeb08b4b355273994f8b87032000a8"}, + {file = "rpds_py-0.30.0-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fe5fa731a1fa8a0a56b0977413f8cacac1768dad38d16b3a296712709476fbd5"}, + {file = "rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:74a3243a411126362712ee1524dfc90c650a503502f135d54d1b352bd01f2404"}, + {file = "rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:3e8eeb0544f2eb0d2581774be4c3410356eba189529a6b3e36bbbf9696175856"}, + {file = "rpds_py-0.30.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:dbd936cde57abfee19ab3213cf9c26be06d60750e60a8e4dd85d1ab12c8b1f40"}, + {file = "rpds_py-0.30.0-cp314-cp314t-win32.whl", hash = "sha256:dc824125c72246d924f7f796b4f63c1e9dc810c7d9e2355864b3c3a73d59ade0"}, + {file = "rpds_py-0.30.0-cp314-cp314t-win_amd64.whl", hash = "sha256:27f4b0e92de5bfbc6f86e43959e6edd1425c33b5e69aab0984a72047f2bcf1e3"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:c2262bdba0ad4fc6fb5545660673925c2d2a5d9e2e0fb603aad545427be0fc58"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:ee6af14263f25eedc3bb918a3c04245106a42dfd4f5c2285ea6f997b1fc3f89a"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3adbb8179ce342d235c31ab8ec511e66c73faa27a47e076ccc92421add53e2bb"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:250fa00e9543ac9b97ac258bd37367ff5256666122c2d0f2bc97577c60a1818c"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:9854cf4f488b3d57b9aaeb105f06d78e5529d3145b1e4a41750167e8c213c6d3"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:993914b8e560023bc0a8bf742c5f303551992dcb85e247b1e5c7f4a7d145bda5"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58edca431fb9b29950807e301826586e5bbf24163677732429770a697ffe6738"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_31_riscv64.whl", hash = "sha256:dea5b552272a944763b34394d04577cf0f9bd013207bc32323b5a89a53cf9c2f"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ba3af48635eb83d03f6c9735dfb21785303e73d22ad03d489e88adae6eab8877"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:dff13836529b921e22f15cb099751209a60009731a68519630a24d61f0b1b30a"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_i686.whl", hash = "sha256:1b151685b23929ab7beec71080a8889d4d6d9fa9a983d213f07121205d48e2c4"}, + {file = "rpds_py-0.30.0-pp311-pypy311_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:ac37f9f516c51e5753f27dfdef11a88330f04de2d564be3991384b2f3535d02e"}, + {file = "rpds_py-0.30.0.tar.gz", hash = "sha256:dd8ff7cf90014af0c0f787eea34794ebf6415242ee1d6fa91eaba725cc441e84"}, +] + +[[package]] +name = "s3transfer" +version = "0.17.0" +description = "An Amazon S3 Transfer Manager" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "s3transfer-0.17.0-py3-none-any.whl", hash = "sha256:ce3801712acf4ad3e89fb9990df97b4972e93f4b3b0004d214be5bce12814c20"}, + {file = "s3transfer-0.17.0.tar.gz", hash = "sha256:9edeb6d1c3c2f89d6050348548834ad8289610d886e5bf7b7207728bd43ce33a"}, +] + +[package.dependencies] +botocore = ">=1.37.4,<2.0a0" + +[package.extras] +crt = ["botocore[crt] (>=1.37.4,<2.0a0)"] + +[[package]] +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" 
+optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main"] +files = [ + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, +] + +[[package]] +name = "smmap" +version = "5.0.3" +description = "A pure Python implementation of a sliding window memory map manager" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "smmap-5.0.3-py3-none-any.whl", hash = "sha256:c106e05d5a61449cf6ba9a1e650227ecfb141590d2a98412103ff35d89fc7b2f"}, + {file = "smmap-5.0.3.tar.gz", hash = "sha256:4d9debb8b99007ae47165abc08670bd74cb74b5227dda7f643eccc4e9eb5642c"}, +] + +[[package]] +name = "sqlparse" +version = "0.5.5" +description = "A non-validating SQL parser." +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "sqlparse-0.5.5-py3-none-any.whl", hash = "sha256:12a08b3bf3eec877c519589833aed092e2444e68240a3577e8e26148acc7b1ba"}, + {file = "sqlparse-0.5.5.tar.gz", hash = "sha256:e20d4a9b0b8585fdf63b10d30066c7c94c5d7a7ec47c889a2d83a3caa93ff28e"}, +] + +[package.extras] +dev = ["build"] +doc = ["sphinx"] + +[[package]] +name = "typing-extensions" +version = "4.15.0" +description = "Backported and Experimental Type Hints for Python 3.9+" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548"}, + {file = "typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466"}, +] + +[[package]] +name = "tzdata" +version = "2026.2" +description = "Provider of IANA time zone data" +optional = false +python-versions = ">=2" +groups = ["main"] +markers = "sys_platform == \"win32\"" +files = [ + {file = 
"tzdata-2026.2-py2.py3-none-any.whl", hash = "sha256:bbe9af844f658da81a5f95019480da3a89415801f6cc966806612cc7169bffe7"}, + {file = "tzdata-2026.2.tar.gz", hash = "sha256:9173fde7d80d9018e02a662e168e5a2d04f87c41ea174b139fbef642eda62d10"}, +] + +[[package]] +name = "urllib3" +version = "2.3.0" +description = "HTTP library with thread-safe connection pooling, file post, and more." +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df"}, + {file = "urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d"}, +] + +[package.extras] +brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""] +h2 = ["h2 (>=4,<5)"] +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] +zstd = ["zstandard (>=0.18.0)"] + +[[package]] +name = "websocket-client" +version = "1.9.0" +description = "WebSocket client for Python with low level API options" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "websocket_client-1.9.0-py3-none-any.whl", hash = "sha256:af248a825037ef591efbf6ed20cc5faa03d3b47b9e5a2230a529eeee1c1fc3ef"}, + {file = "websocket_client-1.9.0.tar.gz", hash = "sha256:9e813624b6eb619999a97dc7958469217c3176312b3a16a4bd1bc7e08a46ec98"}, +] + +[package.extras] +docs = ["Sphinx (>=6.0)", "myst-parser (>=2.0.0)", "sphinx_rtd_theme (>=1.1.0)"] +optional = ["python-socks", "wsaccel"] +test = ["pytest", "websockets"] + +[metadata] +lock-version = "2.1" +python-versions = ">=3.14,<3.15" +content-hash = "afb68fbeabb5c2007a80e53a26ce2d49911b4adb2ea557e893dfa5734b6e9253" diff --git a/src/run.py b/src/run.py index 049e01443..fabe0285b 100755 --- a/src/run.py +++ b/src/run.py @@ -35,9 +35,85 @@ REST_SERVICE_HOST_DEFAULT = "localhost" REST_SERVICE_PORT_DEFAULT = 8000 +# Maximum seconds to wait 
for an embedded REST service subprocess to become +# reachable after we spawn it. Empirically the Django dev server takes ~3s +# to bind on a typical dev machine. +EMBEDDED_REST_SERVICE_BOOT_TIMEOUT_SECONDS = 30.0 + + +def _is_rest_service_reachable(host: str, port, timeout: float = 0.5) -> bool: + """Return True if a TCP connection to host:port succeeds within timeout.""" + import socket + try: + with socket.create_connection((host, int(port)), timeout=timeout): + return True + except OSError: + return False + + +def _pick_free_localhost_port() -> int: + """Bind to port 0 on localhost to discover a free port, then release it.""" + import socket + s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + s.bind(("127.0.0.1", 0)) + port = s.getsockname()[1] + s.close() + return port + + +def _terminate_subprocess(proc): + """Best-effort terminate then kill of a subprocess.Popen handle.""" + try: + proc.terminate() + proc.wait(timeout=5.0) + except Exception: + try: + proc.kill() + except Exception: + pass + + +def _start_embedded_rest_service(port: int): + """Start the workspace builder Django dev server as a subprocess on the + given port and wait for it to be reachable. Returns the Popen handle. + + Used by the `simulate` subcommand to make itself self-contained when no + REST service is already running. The subprocess is terminated via + atexit so it cleans up even on fatal() / unexpected exits. 
+ """ + import atexit + import subprocess + import sys + import time + + manage_py_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "manage.py") + proc = subprocess.Popen( + [sys.executable, manage_py_path, "runserver", f"127.0.0.1:{port}", "--noreload"], + stdout=subprocess.DEVNULL, + stderr=subprocess.DEVNULL, + ) + atexit.register(_terminate_subprocess, proc) + + deadline = time.time() + EMBEDDED_REST_SERVICE_BOOT_TIMEOUT_SECONDS + while time.time() < deadline: + if proc.poll() is not None: + raise RuntimeError( + f"Embedded REST service subprocess exited prematurely " + f"(rc={proc.returncode})" + ) + if _is_rest_service_reachable("127.0.0.1", port, timeout=0.5): + return proc + time.sleep(0.2) + _terminate_subprocess(proc) + raise RuntimeError( + f"Embedded REST service did not become reachable within " + f"{EMBEDDED_REST_SERVICE_BOOT_TIMEOUT_SECONDS} seconds" + ) + INFO_COMMAND = 'info' RUN_COMMAND = 'run' UPLOAD_COMMAND = 'upload' +SIMULATE_COMMAND = 'simulate' CUSTOMIZATION_RULES_DEFAULT = "map-customization-rules" @@ -199,12 +275,28 @@ def coalesce(*vals): return next((v for v in vals if v not in (None, "")), None) +def build_simulate_envelope(papi_response_data: dict, workspace_name: str) -> str: + """Construct the JSON envelope printed on stdout by the simulate subcommand + after a successful upload. Pure function — no I/O. + """ + envelope = { + "task_id": papi_response_data.get("task_id"), + "workspace_name": workspace_name, + } + return json.dumps(envelope) + + def main(): parser = ArgumentParser(description="Run onboarding script to generate initial workspace and SLX info") - parser.add_argument('command', action='store', choices=[INFO_COMMAND, RUN_COMMAND, UPLOAD_COMMAND], + parser.add_argument('command', action='store', + choices=[INFO_COMMAND, RUN_COMMAND, UPLOAD_COMMAND, SIMULATE_COMMAND], help=f'{SERVICE_NAME} action to perform. ' '"info" returns info about the available components. 
' - '"run" runs the {SERVICE_NAME} components to generate workspace/SLX files.') + f'"run" runs the {SERVICE_NAME} components to generate workspace/SLX files. ' + '"upload" uploads previously generated workspace data to PAPI. ' + '"simulate" drives the simulator pipeline end-to-end against a YAML test ' + 'config (uses the bundled simulator codecollection); add --upload to ' + 'POST the resulting archive to PAPI.') parser.add_argument('-v', '--verbose', action='store_true', help="Print more detailed output.") parser.add_argument('-k', '--kubeconfig', action='store', dest='kubeconfig', default='kubeconfig', @@ -221,6 +313,9 @@ def main(): parser.add_argument('--upload-info', action='store', dest="upload_info", default="uploadInfo.yaml", help="Location of the uploadInfo.yaml file that was downloaded from the RunWhen GUI " "after creating a new workspace. The path is relative to the base directory.") + parser.add_argument('--config', action='store', dest='simulate_config', + help="Path to the YAML test config file (simulate subcommand only). " + "Path is relative to the base directory if not absolute.") parser.add_argument('--rest-service-host', action='store', dest="rest_service_host", default="localhost:8000", help=f'Host/port info for where the {SERVICE_NAME} REST service is running. ' @@ -284,6 +379,35 @@ def main(): rest_service_host = REST_SERVICE_HOST_DEFAULT rest_service_port = REST_SERVICE_PORT_DEFAULT + # When invoked as `simulate`, auto-start an embedded REST service + # subprocess if (a) the user didn't explicitly point at an existing one + # via --rest-service-host, and (b) nothing is already listening on the + # default host:port (e.g. a Django runserver in another terminal, or a + # runwhen-local container's bundled service). This makes `simulate` + # self-contained for standalone consumers without disturbing the + # existing run/upload flows. 
+ # The --rest-service-host argument has a string default ("localhost:8000"), + # so args.rest_service_host is always truthy. Treat the default as "user + # did not explicitly point at a service"; only skip embedding when the + # user supplied a custom value. + user_supplied_rest_host = ( + args.rest_service_host + and args.rest_service_host != f"{REST_SERVICE_HOST_DEFAULT}:{REST_SERVICE_PORT_DEFAULT}" + ) + if args.command == SIMULATE_COMMAND and not user_supplied_rest_host: + if not _is_rest_service_reachable(rest_service_host, rest_service_port, timeout=0.5): + embedded_port = _pick_free_localhost_port() + print( + f"No REST service detected at {rest_service_host}:{rest_service_port}; " + f"starting embedded workspace-builder service on localhost:{embedded_port}" + ) + # Bind on 127.0.0.1 (loopback only) but route subsequent client + # requests through "localhost" so Django's ALLOWED_HOSTS=["localhost"] + # accepts them. Both resolve to the same socket on the same host. + _start_embedded_rest_service(embedded_port) + rest_service_host = "localhost" + rest_service_port = embedded_port + if args.command == INFO_COMMAND: info_url = f"http://{rest_service_host}:{rest_service_port}/info/" response = call_rest_service_with_retries(lambda: requests.get(info_url)) @@ -450,7 +574,13 @@ def main(): ) if cloud_config is None: - raise ValueError("'cloudConfig' section is missing; discovery configuration must be explicit.") + if args.command == SIMULATE_COMMAND: + # The simulator doesn't do live discovery, so it doesn't need a real + # cloudConfig. Default to an empty dict so the rest of the pipeline + # treats discovery as a no-op. 
+ cloud_config = {} + else: + raise ValueError("'cloudConfig' section is missing; discovery configuration must be explicit.") + azure_config = None @@ -619,6 +749,40 @@ def main(): output_path = os.path.join(base_directory, args.output_path) status_file_path = os.path.join(output_path, ".status") + _original_command = args.command + _SIMULATE_REQUEST_OVERRIDES = {} + _simulator_temp_dir = None + if args.command == SIMULATE_COMMAND: + if not args.simulate_config: + fatal("simulate requires --config <path>") + config_path = args.simulate_config + if not os.path.isabs(config_path): + config_path = os.path.join(base_directory, config_path) + if not os.path.exists(config_path): + fatal(f"Test config file not found: {config_path}") + test_config_text = read_file(config_path, "r") + + # Materialize the bundled simulator codecollection as a temp git repo + # because the codecollection loader requires git.Repo.clone_from(...). + from utils import materialize_simulator_codecollection_repo + simulator_cc_dir = os.path.join( + os.path.dirname(os.path.abspath(__file__)), "simulator-codecollection" + ) + _simulator_temp_dir = tempfile.mkdtemp(prefix="wb-simulator-") + simulator_repo_path = materialize_simulator_codecollection_repo( + simulator_cc_dir, _simulator_temp_dir + ) + + args.components = "test_synth,generation_rules,test_groups,render_output_items" + # args.upload_data already reflects whether --upload was passed; preserve it. 
+ args.command = RUN_COMMAND + _SIMULATE_REQUEST_OVERRIDES = { + "testConfig": test_config_text, + "codeCollections": [ + {"repoURL": simulator_repo_path, "ref": "main"}, + ], + } + if args.command == RUN_COMMAND: request_data = dict() @@ -651,6 +815,7 @@ def main(): if not args.components: fatal("Error: at least one component must be specified to run") request_data['components'] = args.components + request_data.update(_SIMULATE_REQUEST_OVERRIDES) if final_kubeconfig_path and os.path.exists(final_kubeconfig_path): print(f"Indexing Kubernetes with kubeconfig from {final_kubeconfig_path}") kubeconfig_data = read_file(final_kubeconfig_path, "rb") @@ -806,7 +971,7 @@ def main(): try: response = requests.post(upload_url, data=upload_request_text, headers=headers, verify=get_request_verify()) except requests.exceptions.ConnectionError as e: - fatal("Upload of map builder data failed, because the PAPI upload URL is invalid or unavailable; {e}") + fatal(f"Upload of map builder data failed, because the PAPI upload URL is invalid or unavailable; {e}") # NB: The fatal call will already have exited, so this return call isn't really # necessary, but it makes the linting code happy so that it doesn't complain # about possibly uninitialized variables, e.g. response @@ -839,6 +1004,17 @@ def main(): print("Workspace builder data uploaded successfully.") + if _original_command == SIMULATE_COMMAND: + try: + response_data = response.json() + except (ValueError, requests.exceptions.JSONDecodeError): + response_data = {} + print(build_simulate_envelope(response_data, workspace_name)) + + # Cleanup the temp git repo materialized for the simulate subcommand. + if _simulator_temp_dir: + shutil.rmtree(_simulator_temp_dir, ignore_errors=True) + # Add cheat-sheet integration, which points at the output items and # generates the list of local commands that exist in the TaskSet. 
# FIXME: I think it would probably be cleaner and more decoupled to move the diff --git a/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/generation-rules/passthrough.yaml b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/generation-rules/passthrough.yaml new file mode 100644 index 000000000..109586fdb --- /dev/null +++ b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/generation-rules/passthrough.yaml @@ -0,0 +1,38 @@ +# Passthrough generation rule for the simulator. Matches synthesized +# resources under the kubernetes platform with type deployment, and emits +# one SLX per resource with three output items: runbook, sli, slo. +# +# Conditional rendering: the rule unconditionally schedules all three +# output items, but the SLI/SLO templates raise a Jinja error (and are +# therefore skipped by render_output_items) when match_resource.sli or +# match_resource.slo is empty/None. Only runbook is always rendered. +# +# We don't gate via separate generation rules because collect_emitted_slxs +# de-duplicates SLXs by full_name. Two rules emitting an SLX with the same +# baseName + qualifiers collide and only the first set of output items +# is retained, so per-rule predicate gating wouldn't accumulate output +# items the way the simulator needs. 
+apiVersion: runwhen.com/v1 +kind: GenerationRules +spec: + platform: kubernetes + generationRules: + - resourceTypes: + - deployment + matchRules: + - type: pattern + pattern: ".+" + properties: ["name"] + mode: substring + slxs: + - baseName: "test" + qualifiers: ["resource"] + outputItems: + - type: slx + templateName: test-slx.yaml + - type: runbook + templateName: test-runbook.yaml + - type: sli + templateName: test-sli.yaml + - type: slo + templateName: test-slo.yaml diff --git a/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-runbook.yaml b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-runbook.yaml new file mode 100644 index 000000000..3b90844bd --- /dev/null +++ b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-runbook.yaml @@ -0,0 +1,21 @@ +apiVersion: runwhen.com/v1 +kind: Runbook +metadata: + name: {{ slx_name }} + labels: +{% include "common-labels.yaml" %} + annotations: + fullSlxName: {{ full_slx_name }} + sourceGenerationRuleRepoURL: {{ repo_url }} + sourceGenerationRuleRepoRef: {{ ref }} + sourceGenerationRulePath: {{ generation_rule_file_path }} + qualifiers: '{{ qualifiers | tojson }}' + internal.runwhen.com/generated-by: {{ generated_by }} +spec: + location: {{ location_id }} + codeBundle: + repoUrl: {{ match_resource.repo_url }} + ref: {{ match_resource.ref }} + pathToRobot: {{ match_resource.runbook.get("pathToRobot", "") }} + configProvided: {{ match_resource.runbook.get("configProvided", []) | tojson }} + secretsProvided: {{ match_resource.runbook.get("secretsProvided", []) | tojson }} diff --git a/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-sli.yaml b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-sli.yaml new file mode 100644 index 000000000..4d73f7b16 --- /dev/null +++ 
b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-sli.yaml @@ -0,0 +1,35 @@ +{# Skip rendering when the test config didn't supply an "sli" subdict for + this SLX. Trigger a deliberate Jinja error so render_output_items + classifies this template as skipped (and the file is never written). #} +{% if not match_resource.sli %}{% set _skip = (1 / 0) %}{% endif %} +apiVersion: runwhen.com/v1 +kind: ServiceLevelIndicator +metadata: + name: {{ slx_name }} + labels: +{% include "common-labels.yaml" %} + annotations: + fullSlxName: {{ full_slx_name }} + sourceGenerationRuleRepoURL: {{ repo_url }} + sourceGenerationRuleRepoRef: {{ ref }} + sourceGenerationRulePath: {{ generation_rule_file_path }} + qualifiers: '{{ qualifiers | tojson }}' + internal.runwhen.com/generated-by: {{ generated_by }} +spec: + location: {{ location_id }} + locations: + - {{ location_id }} + codeBundle: + repoUrl: {{ match_resource.repo_url }} + ref: {{ match_resource.ref }} + pathToRobot: {{ match_resource.sli.get("pathToRobot", "") }} + description: {{ match_resource.sli.get("description", "") | tojson }} + displayUnitsLong: {{ match_resource.sli.get("displayUnitsLong", "") | tojson }} + displayUnitsShort: {{ match_resource.sli.get("displayUnitsShort", "") | tojson }} + intervalStrategy: {{ match_resource.sli.get("intervalStrategy", "intermezzo") | tojson }} + intervalSeconds: {{ match_resource.sli.get("intervalSeconds", 60) }} + configProvided: {{ match_resource.sli.get("configProvided", []) | tojson }} + secretsProvided: {{ match_resource.sli.get("secretsProvided", []) | tojson }} +{% if match_resource.sli.get("alertConfig") %} + alertConfig: {{ match_resource.sli.get("alertConfig") | tojson }} +{% endif %} diff --git a/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slo.yaml b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slo.yaml new file mode 100644 index 000000000..c0dab9c3e 
--- /dev/null +++ b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slo.yaml @@ -0,0 +1,26 @@ +{# Skip rendering when the test config didn't supply an "slo" subdict for + this SLX. Trigger a deliberate Jinja error so render_output_items + classifies this template as skipped (and the file is never written). #} +{% if not match_resource.slo %}{% set _skip = (1 / 0) %}{% endif %} +apiVersion: runwhen.com/v1 +kind: ServiceLevelObjective +metadata: + name: {{ slx_name }} + labels: +{% include "common-labels.yaml" %} + annotations: + fullSlxName: {{ full_slx_name }} + sourceGenerationRuleRepoURL: {{ repo_url }} + sourceGenerationRuleRepoRef: {{ ref }} + sourceGenerationRulePath: {{ generation_rule_file_path }} + qualifiers: '{{ qualifiers | tojson }}' + internal.runwhen.com/generated-by: {{ generated_by }} +spec: + location: {{ location_id }} + codeBundle: + repoUrl: {{ match_resource.repo_url }} + ref: {{ match_resource.ref }} + pathToRobot: {{ match_resource.slo.get("pathToRobot", "") }} + target: {{ match_resource.slo.get("target", 99.0) }} + configProvided: {{ match_resource.slo.get("configProvided", []) | tojson }} + secretsProvided: {{ match_resource.slo.get("secretsProvided", []) | tojson }} diff --git a/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slx.yaml b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slx.yaml new file mode 100644 index 000000000..554717eea --- /dev/null +++ b/src/simulator-codecollection/codebundles/simulator-passthrough/.runwhen/templates/test-slx.yaml @@ -0,0 +1,22 @@ +apiVersion: runwhen.com/v1 +kind: ServiceLevelX +metadata: + name: {{ slx_name }} + labels: +{% include "common-labels.yaml" %} + annotations: + fullSlxName: {{ full_slx_name }} + sourceGenerationRuleRepoURL: {{ repo_url }} + sourceGenerationRuleRepoRef: {{ ref }} + sourceGenerationRulePath: {{ generation_rule_file_path }} + qualifiers: '{{ qualifiers | 
tojson }}' + internal.runwhen.com/generated-by: {{ generated_by }} +spec: + alias: {{ match_resource.alias | tojson }} + asMeasuredBy: {{ match_resource.as_measured_by | tojson }} + imageURL: {{ match_resource.image_url | tojson }} + statement: {{ match_resource.statement | tojson }} + owners: {{ match_resource.owners | tojson }} + configProvided: {{ match_resource.config_provided | tojson }} + tags: {{ match_resource.tags | tojson }} + additionalContext: {{ match_resource.additional_context | tojson }} diff --git a/src/utils.py b/src/utils.py index d370b90b6..aa6c10720 100644 --- a/src/utils.py +++ b/src/utils.py @@ -222,6 +222,22 @@ def get_request_verify(): return False return None +def materialize_simulator_codecollection_repo(source_dir: str, target_dir: str) -> str: + """Copy a simulator codecollection directory into target_dir and `git init` it + so the codecollection loader (which uses git.Repo.clone_from) can consume it. + Returns the path to the created repo directory.""" + import shutil + import git + repo_path = os.path.join(target_dir, "simulator-codecollection") + if os.path.exists(repo_path): + shutil.rmtree(repo_path) + shutil.copytree(source_dir, repo_path) + repo = git.Repo.init(repo_path, initial_branch="main") + repo.git.add(all=True) + repo.index.commit("Initial commit") + return repo_path + + def mask_string(input_string: str, start_visible: int = 2, end_visible: int = 2, mask_char: str = '*') -> str: """ Mask a string by keeping the first and last few characters visible and replacing diff --git a/src/workspace_builder/tests_simulator.py b/src/workspace_builder/tests_simulator.py new file mode 100644 index 000000000..e96e28aa8 --- /dev/null +++ b/src/workspace_builder/tests_simulator.py @@ -0,0 +1,589 @@ +import json +import os +import tempfile +from http import HTTPStatus + +from django.test import TestCase + + +class SimulatorTestCase(TestCase): + def test_test_synth_runs_as_noop(self): + """test_synth alone should run without error and 
produce no SLXs.""" + request_data = { + "components": "test_synth", + "workspaceName": "ws-noop", + "papiURL": "http://papi.local", + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + + +TEST_CONFIG_YAML_BASIC = """ +slxs: + my-app-ops: + levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + alias: My App Operations + statement: Operational tasks for my-app + runbook: + pathToRobot: codebundles/k8s-deployment-ops/runbook.robot + configProvided: + - {name: NAMESPACE, value: demo} + secretsProvided: + - {name: kubeconfig, workspaceKey: kubeconfig} +""" + + +class TestSynthSynthesisTestCase(TestCase): + def test_synthesizes_one_resource_per_slx(self): + """test_synth should register one resource per slxs entry.""" + request_data = { + "components": "test_synth,dump_resources", + "workspaceName": "ws-synth", + "papiURL": "http://papi.local", + "testConfig": TEST_CONFIG_YAML_BASIC, + "resourceDumpPath": "resource_dump.yaml", + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + # dump_resources writes the resource registry into the response archive. + # Decode and assert that exactly one resource named "my-app-ops" exists, + # registered under the kubernetes platform. 
+ from base64 import b64decode + import io, tarfile, yaml + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + dump_member = next(m for m in archive.getmembers() if m.name.endswith("resource_dump.yaml")) + dump_text = archive.extractfile(dump_member).read().decode("utf-8") + # The YAML uses custom tags (!Registry, !ResourceType); just assert the + # expected SLX slug appears in the dump as a resource name. + self.assertIn("my-app-ops", dump_text) + self.assertIn("kubernetes", dump_text) # platform name + + +# Path to the synthetic codecollection bundled with runwhen-local for the simulator. +SIMULATOR_CODECOLLECTION_PATH = os.path.abspath( + os.path.join(os.path.dirname(__file__), "..", "simulator-codecollection") +) + + +def _materialize_simulator_codecollection_repo(target_dir: str) -> str: + """Thin wrapper around the shared helper for backwards compatibility within + this test module.""" + from utils import materialize_simulator_codecollection_repo + return materialize_simulator_codecollection_repo(SIMULATOR_CODECOLLECTION_PATH, target_dir) + + +class PassthroughGenerationRuleTestCase(TestCase): + def setUp(self): + self.tmp = tempfile.TemporaryDirectory() + self.codecollection_repo_path = _materialize_simulator_codecollection_repo(self.tmp.name) + + def tearDown(self): + self.tmp.cleanup() + + def test_passthrough_rule_produces_one_slx_per_resource(self): + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": "ws-rule", + "workspaceOwnerEmail": "test@example.com", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": TEST_CONFIG_YAML_BASIC, + "codeCollections": [ + {"repoURL": self.codecollection_repo_path, "ref": "main"} + ], + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + + # Inspect the 
response archive: assert one SLX directory was produced + # by the passthrough rule. The exact SLX dir name depends on baseName + + # qualifier resolution; don't hard-code it -- just assert exactly one + # SLX dir is present. + # + # Note: in this task we don't ship a runbook template yet, so the + # output items inside the SLX dir all get skipped and no files are + # actually written into slxs//. We therefore look for SLX dir + # paths in two places: + # 1. archive members (any file under workspaces//slxs//) + # 2. the skipped templates report, which records the would-be paths + from base64 import b64decode + import io, re, tarfile + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + all_members = [m.name for m in archive.getmembers()] + slx_dirs: set[str] = set() + for m in archive.getmembers(): + if "/slxs/" in m.name and not m.isdir(): + slx_dirs.add(m.name.split("/slxs/", 1)[1].split("/", 1)[0]) + report_member = next( + (m for m in archive.getmembers() if m.name.endswith("skipped_templates_report.md")), + None, + ) + if report_member is not None: + report_text = archive.extractfile(report_member).read().decode("utf-8") + for path_match in re.finditer(r"/slxs/([^/\s]+)/", report_text): + slx_dirs.add(path_match.group(1)) + self.assertEqual( + len(slx_dirs), 1, + f"Expected exactly 1 SLX dir, got: {slx_dirs}; archive members: {all_members}" + ) + + +class SimulatorRunbookRenderTestCase(TestCase): + def setUp(self): + self.tmp = tempfile.TemporaryDirectory() + self.codecollection_repo_path = _materialize_simulator_codecollection_repo(self.tmp.name) + + def tearDown(self): + self.tmp.cleanup() + + def test_runbook_yaml_is_rendered_with_expected_labels_and_spec(self): + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": "ws-render", + "workspaceOwnerEmail": "test@example.com", + "papiURL": "http://papi.local", + "locationId": 
"loc-1", + "testConfig": TEST_CONFIG_YAML_BASIC, + "codeCollections": [ + {"repoURL": self.codecollection_repo_path, "ref": "main"}, + ], + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + + from base64 import b64decode + import io, tarfile, yaml + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + + # Find the rendered runbook.yaml. SLX dir name comes from + # baseName + qualifier resolution and is not hard-coded. + runbook_members = [ + m for m in archive.getmembers() + if m.name.endswith("/runbook.yaml") and "/slxs/" in m.name + ] + all_members = [m.name for m in archive.getmembers()] + self.assertEqual( + len(runbook_members), 1, + f"Expected exactly 1 runbook.yaml; got {len(runbook_members)}. Members: {all_members}", + ) + + runbook_text = archive.extractfile(runbook_members[0]).read().decode("utf-8") + runbook_doc = yaml.safe_load(runbook_text) + + self.assertEqual(runbook_doc["kind"], "Runbook") + # metadata.name has the workspace-prefixed format produced by real runs. + self.assertTrue( + runbook_doc["metadata"]["name"].startswith("ws-render--"), + f"Expected metadata.name to start with 'ws-render--'; got {runbook_doc['metadata']['name']}", + ) + # Standard labels from common-labels.yaml should be present. + labels = runbook_doc["metadata"]["labels"] + self.assertEqual(labels["workspace"], "ws-render") + self.assertEqual(labels["locationId"], "loc-1") + # spec.codeBundle assembled from test config repoURL/ref + runbook.pathToRobot. 
+ spec = runbook_doc["spec"] + self.assertEqual(spec["codeBundle"]["repoUrl"], + "https://github.com/runwhen-contrib/rw-cli-codecollection.git") + self.assertEqual(spec["codeBundle"]["ref"], "main") + self.assertEqual(spec["codeBundle"]["pathToRobot"], + "codebundles/k8s-deployment-ops/runbook.robot") + # configProvided/secretsProvided pass through from the test config. + self.assertEqual(spec["configProvided"], [{"name": "NAMESPACE", "value": "demo"}]) + self.assertEqual(spec["secretsProvided"], + [{"name": "kubeconfig", "workspaceKey": "kubeconfig"}]) + + +TEST_CONFIG_YAML_RUNBOOK_ONLY = """ +slxs: + bare-slx: + levelOfDetail: basic + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + runbook: + commands: [] +""" + +TEST_CONFIG_YAML_FULL = """ +slxs: + full-slx: + levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + runbook: { commands: [] } + sli: { threshold: 0.99 } + slo: { target: 99.5 } +""" + + +class ConditionalOutputItemTestCase(TestCase): + def setUp(self): + self.tmp = tempfile.TemporaryDirectory() + self.codecollection_repo_path = _materialize_simulator_codecollection_repo(self.tmp.name) + + def tearDown(self): + self.tmp.cleanup() + + def _post(self, test_config_yaml: str, workspace_name: str) -> set[str]: + request_data = { + "components": "test_synth,generation_rules,render_output_items", + "workspaceName": workspace_name, + "workspaceOwnerEmail": "test@example.com", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": test_config_yaml, + "codeCollections": [ + {"repoURL": self.codecollection_repo_path, "ref": "main"}, + ], + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + from base64 import b64decode + import io, tarfile + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), 
mode="r") + # Group filenames by basename in the SLX dir. + by_basename: set[str] = set() + for m in archive.getmembers(): + if "/slxs/" in m.name and not m.isdir(): + by_basename.add(os.path.basename(m.name)) + return by_basename + + def test_runbook_only_omits_sli_and_slo(self): + files = self._post(TEST_CONFIG_YAML_RUNBOOK_ONLY, "ws-bare") + self.assertIn("runbook.yaml", files) + self.assertNotIn("sli.yaml", files) + self.assertNotIn("slo.yaml", files) + + def test_full_slx_renders_runbook_sli_slo(self): + files = self._post(TEST_CONFIG_YAML_FULL, "ws-full") + self.assertIn("runbook.yaml", files) + self.assertIn("sli.yaml", files) + self.assertIn("slo.yaml", files) + + +# ---------------------------------------------------------------------- +# Inventory + cardinality + groups + relationships tests +# ---------------------------------------------------------------------- + +INVENTORY_TEST_CONFIG = """ +inventory: + clusters: + - name: prod-cluster + namespaces: + - name: prod-app + - name: prod-cache + - name: edge-cluster + namespaces: + - name: ingress + + resources: + - id: deploy-app + kind: Deployment + name: my-app + cluster: prod-cluster + namespace: prod-app + labels: + app.kubernetes.io/name: my-app + tier: backend + - id: sts-cache + kind: StatefulSet + name: my-cache + cluster: prod-cluster + namespace: prod-cache + - id: ingress-public + kind: Ingress + name: public-api + cluster: edge-cluster + namespace: ingress + +slxGroups: + - name: My App + slxs: [app-ops, app-health] + - name: Edge Ingress + slxs: [ingress-health] + dependsOn: [My App] + +slxRelationships: + - subject: ingress-health + verb: dependent-on + object: app-health + +slxs: + app-ops: + levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-ops + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + resources: [deploy-app] + runbook: + pathToRobot: codebundles/k8s-deployment-ops/runbook.robot + + app-health: + 
levelOfDetail: detailed + codeCollection: rw-cli-codecollection + codeBundle: k8s-deployment-healthcheck + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + resources: [deploy-app] # 1:N — same resource, different SLX + runbook: + pathToRobot: codebundles/k8s-deployment-healthcheck/runbook.robot + sli: + pathToRobot: codebundles/k8s-deployment-healthcheck/sli.robot + + ingress-health: + levelOfDetail: basic + codeCollection: rw-cli-codecollection + codeBundle: k8s-ingress-healthcheck + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + resources: [ingress-public] + runbook: + pathToRobot: codebundles/k8s-ingress-healthcheck/runbook.robot + + prod-aggregate: + levelOfDetail: basic + codeCollection: rw-cli-codecollection + codeBundle: k8s-aggregate-health + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + resources: [deploy-app, sts-cache] # N:1 — multiple resources, one SLX + runbook: + pathToRobot: codebundles/k8s-aggregate-health/runbook.robot +""" + + +class InventoryAndGroupsTestCase(TestCase): + @classmethod + def setUpClass(cls): + super().setUpClass() + cls._tmp = tempfile.TemporaryDirectory() + cls.codecollection_repo_path = _materialize_simulator_codecollection_repo(cls._tmp.name) + + request_data = { + "components": "test_synth,generation_rules,test_groups,render_output_items", + "workspaceName": "ws-inv", + "workspaceOwnerEmail": "test@example.com", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": INVENTORY_TEST_CONFIG, + "codeCollections": [ + {"repoURL": cls.codecollection_repo_path, "ref": "main"}, + ], + } + response = cls.client_class().post( + "/run/", data=request_data, content_type="application/json" + ) + if response.status_code != HTTPStatus.OK: + cls._tmp.cleanup() + raise AssertionError( + f"/run/ failed: {response.status_code} {response.content[:500]}" + ) + + from base64 import b64decode + import io, tarfile + 
archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + cls._archive_members: dict[str, bytes] = {} + for m in archive.getmembers(): + if not m.isdir(): + cls._archive_members[m.name] = archive.extractfile(m).read() + + @classmethod + def tearDownClass(cls): + cls._tmp.cleanup() + super().tearDownClass() + + def _read_yaml(self, path: str) -> dict: + import yaml + for member_name, content in self._archive_members.items(): + if member_name.endswith(path): + return yaml.safe_load(content.decode("utf-8")) + raise AssertionError( + f"No archive member ending in {path}; members: {sorted(self._archive_members.keys())}" + ) + + def _slx_dirs_with_basename(self, basename: str) -> list[str]: + out = [] + for member_name in self._archive_members: + if member_name.endswith(f"/{basename}") and "/slxs/" in member_name: + out.append(member_name.split("/slxs/", 1)[1].split("/", 1)[0]) + return out + + def test_four_slxs_produced_from_test_config(self): + slx_dirs = self._slx_dirs_with_basename("slx.yaml") + self.assertEqual(len(slx_dirs), 4, + f"Expected 4 SLX dirs, got {len(slx_dirs)}: {slx_dirs}") + + def test_kind_and_cluster_tags_auto_derived_from_inventory(self): + # Find the ingress SLX (its kind tag should be Ingress). 
+        import yaml
+        for member_name, content in self._archive_members.items():
+            if member_name.endswith("/slx.yaml") and "/slxs/" in member_name:
+                doc = yaml.safe_load(content.decode("utf-8"))
+                tags = {t["name"]: t["value"] for t in doc["spec"]["tags"]}
+                if tags.get("kind") == "Ingress":
+                    self.assertEqual(tags["cluster"], "edge-cluster")
+                    self.assertEqual(tags["namespace"], "ingress")
+                    self.assertEqual(tags["resource_name"], "public-api")
+                    self.assertEqual(tags["resource_type"], "ingress")
+                    self.assertEqual(tags["platform"], "kubernetes")
+                    return
+        self.fail("No SLX with kind=Ingress found in archive")
+
+    def test_k8s_labels_become_prefixed_tags(self):
+        # The deploy-app inventory resource has app.kubernetes.io/name and tier
+        # labels; these should appear as [k8s] tags on its SLXs.
+        import yaml
+        found = False
+        for member_name, content in self._archive_members.items():
+            if member_name.endswith("/slx.yaml") and "/slxs/" in member_name:
+                doc = yaml.safe_load(content.decode("utf-8"))
+                tag_dict = {t["name"]: t["value"] for t in doc["spec"]["tags"]}
+                if tag_dict.get("kind") == "Deployment" and tag_dict.get("resource_name") == "my-app":
+                    # Note: the same resource backs two SLXs (1:N), so we'll
+                    # match the first one we find.
+                    self.assertEqual(tag_dict.get("[k8s]app.kubernetes.io/name"), "my-app")
+                    self.assertEqual(tag_dict.get("[k8s]tier"), "backend")
+                    found = True
+                    break
+        self.assertTrue(found, "No Deployment SLX with my-app name found")
+
+    def test_n_to_1_aggregate_slx_has_child_resources(self):
+        # The prod-aggregate SLX is bound to two inventory resources;
+        # the second (sts-cache) should appear in additionalContext.childResources.
+        import yaml
+        for member_name, content in self._archive_members.items():
+            if member_name.endswith("/slx.yaml") and "/slxs/" in member_name:
+                doc = yaml.safe_load(content.decode("utf-8"))
+                ctx = doc["spec"].get("additionalContext", {})
+                children = ctx.get("childResources")
+                if children:
+                    self.assertEqual(len(children), 1)
+                    self.assertEqual(children[0]["kind"], "StatefulSet")
+                    self.assertEqual(children[0]["name"], "my-cache")
+                    return
+        self.fail("No SLX with childResources found")
+
+    def test_workspace_yaml_includes_slx_groups(self):
+        ws = self._read_yaml("workspaces/ws-inv/workspace.yaml")
+        groups = {g["name"]: g for g in ws["spec"].get("slxGroups", [])}
+        self.assertIn("My App", groups)
+        self.assertIn("Edge Ingress", groups)
+        # My App group should reference both app-ops and app-health (full SLX names).
+        my_app_slxs = groups["My App"]["slxs"]
+        self.assertEqual(len(my_app_slxs), 2)
+        for slx_name in my_app_slxs:
+            self.assertTrue(slx_name.startswith("ws-inv--"),
+                            f"SLX name in group should be workspace-prefixed: {slx_name}")
+
+    def test_workspace_yaml_includes_slx_relationships(self):
+        ws = self._read_yaml("workspaces/ws-inv/workspace.yaml")
+        relationships = ws["spec"].get("slxRelationships", [])
+        self.assertEqual(len(relationships), 1)
+        rel = relationships[0]
+        # Both subject and object resolve to workspace-prefixed full SLX names.
+ self.assertTrue(rel["subject"].startswith("ws-inv--")) + self.assertTrue(rel["directObject"].startswith("ws-inv--")) + + +# ---------------------------------------------------------------------- +# Defaults inheritance + auto-derived pathToRobot +# ---------------------------------------------------------------------- + +DEFAULTS_TEST_CONFIG = """ +defaults: + repoURL: https://github.com/runwhen-contrib/rw-cli-codecollection.git + ref: main + codeCollection: rw-cli-codecollection + runbook: + secretsProvided: + - {name: kubeconfig, workspaceKey: kubeconfig} + +slxs: + brief-slx: + codeBundle: k8s-deployment-ops + runbook: + configProvided: + - {name: NAMESPACE, value: brief} +""" + + +class DefaultsAndAutoDeriveTestCase(TestCase): + def setUp(self): + self.tmp = tempfile.TemporaryDirectory() + self.codecollection_repo_path = _materialize_simulator_codecollection_repo(self.tmp.name) + + def tearDown(self): + self.tmp.cleanup() + + def _post_and_load_runbook(self) -> dict: + request_data = { + "components": "test_synth,generation_rules,test_groups,render_output_items", + "workspaceName": "ws-defaults", + "workspaceOwnerEmail": "test@example.com", + "papiURL": "http://papi.local", + "locationId": "loc-1", + "testConfig": DEFAULTS_TEST_CONFIG, + "codeCollections": [ + {"repoURL": self.codecollection_repo_path, "ref": "main"}, + ], + } + response = self.client.post( + "/run/", data=request_data, content_type="application/json" + ) + self.assertEqual(HTTPStatus.OK, response.status_code) + from base64 import b64decode + import io, tarfile, yaml + archive_bytes = b64decode(json.loads(response.content)["output"]) + archive = tarfile.open(fileobj=io.BytesIO(archive_bytes), mode="r") + runbook_member = next( + m for m in archive.getmembers() + if m.name.endswith("/runbook.yaml") and "/slxs/" in m.name + ) + return yaml.safe_load(archive.extractfile(runbook_member).read().decode("utf-8")) + + def test_defaults_supply_repoURL_ref_and_codeCollection(self): + runbook = 
self._post_and_load_runbook() + self.assertEqual( + runbook["spec"]["codeBundle"]["repoUrl"], + "https://github.com/runwhen-contrib/rw-cli-codecollection.git", + ) + self.assertEqual(runbook["spec"]["codeBundle"]["ref"], "main") + + def test_pathToRobot_auto_derived_from_codeBundle(self): + runbook = self._post_and_load_runbook() + self.assertEqual( + runbook["spec"]["codeBundle"]["pathToRobot"], + "codebundles/k8s-deployment-ops/runbook.robot", + ) + + def test_runbook_subdict_merges_defaults_secretsProvided_and_slx_configProvided(self): + runbook = self._post_and_load_runbook() + # secretsProvided inherited from defaults + self.assertEqual( + runbook["spec"]["secretsProvided"], + [{"name": "kubeconfig", "workspaceKey": "kubeconfig"}], + ) + # configProvided supplied per-SLX + self.assertEqual( + runbook["spec"]["configProvided"], + [{"name": "NAMESPACE", "value": "brief"}], + ) diff --git a/src/workspace_builder/tests_simulator_cli.py b/src/workspace_builder/tests_simulator_cli.py new file mode 100644 index 000000000..45c32e40a --- /dev/null +++ b/src/workspace_builder/tests_simulator_cli.py @@ -0,0 +1,77 @@ +import json +import os +import subprocess +import tempfile + +import yaml +from unittest import TestCase + + +UPLOAD_INFO = { + "papiURL": "http://papi.local", + "workspaceName": "ws-cli", + "locationId": "loc-1", + "locationName": "loc-1", + "workspaceOwnerEmail": "user@example.com", + "token": "fake-token", +} + +TEST_CONFIG = { + "slxs": { + "cli-slx": { + "levelOfDetail": "basic", + "codeCollection": "rw-cli-codecollection", + "codeBundle": "k8s-deployment-ops", + "runbook": {"commands": []}, + } + } +} + + +class SimulateCliTestCase(TestCase): + def test_simulate_subcommand_is_accepted_and_attempts_run_endpoint(self): + """argparse must accept 'simulate' and the CLI must reach the /run/ POST attempt.""" + with tempfile.TemporaryDirectory() as tmpdir: + upload_info_path = os.path.join(tmpdir, "uploadInfo.yaml") + test_config_path = os.path.join(tmpdir, 
"test.yaml") + with open(upload_info_path, "w") as f: + yaml.safe_dump(UPLOAD_INFO, f) + with open(test_config_path, "w") as f: + yaml.safe_dump(TEST_CONFIG, f) + + # Force REST host to an unreachable address so we don't depend on + # the workspace builder service being up. The CLI should still + # parse args and attempt the POST. + env = os.environ.copy() + env["REST_SERVICE_HOST"] = "127.0.0.1" + env["REST_SERVICE_PORT"] = "1" + + src_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..") + # Use the same Python interpreter as the test runner so we share + # the project's installed dependencies (requests, etc.). + import sys + result = subprocess.run( + [sys.executable, "run.py", "simulate", + "--config", test_config_path, + "--upload-info", upload_info_path, + "--base-directory", tmpdir, + "--rest-service-host", "127.0.0.1:1"], + cwd=src_dir, + env=env, + capture_output=True, + text=True, + timeout=120, + ) + # Exit non-zero is fine (the REST POST fails since service isn't up), + # but stderr/stdout must NOT mention "invalid choice" - that would + # indicate argparse rejected 'simulate'. + combined = (result.stdout + result.stderr).lower() + self.assertNotIn("invalid choice", combined, + f"argparse rejected 'simulate':\nstdout={result.stdout}\nstderr={result.stderr}") + # Confirm the CLI got far enough to attempt the REST call. 
+ self.assertTrue( + "connection refused" in combined + or "rest service" in combined + or "newconnectionerror" in combined, + f"Expected REST call attempt:\nstdout={result.stdout}\nstderr={result.stderr}" + ) diff --git a/src/workspace_builder/tests_simulator_envelope.py b/src/workspace_builder/tests_simulator_envelope.py new file mode 100644 index 000000000..b55c6dc54 --- /dev/null +++ b/src/workspace_builder/tests_simulator_envelope.py @@ -0,0 +1,40 @@ +import json +from unittest import TestCase + +from run import build_simulate_envelope + + +class BuildSimulateEnvelopeTestCase(TestCase): + def test_envelope_contains_task_id_and_workspace_name(self): + papi_response = { + "task_id": "abc-123", + "status": "queued", + "message": "Upload queued", + } + line = build_simulate_envelope(papi_response, "ws-test") + parsed = json.loads(line) + self.assertEqual(parsed, {"task_id": "abc-123", "workspace_name": "ws-test"}) + + def test_envelope_handles_missing_task_id(self): + line = build_simulate_envelope({}, "ws-test") + parsed = json.loads(line) + self.assertEqual(parsed, {"task_id": None, "workspace_name": "ws-test"}) + + def test_envelope_is_single_line_json(self): + line = build_simulate_envelope({"task_id": "abc"}, "ws") + # Single line — no embedded newlines. + self.assertNotIn("\n", line) + # Parseable JSON. + json.loads(line) + + +class SimulateEnvelopeWiringTestCase(TestCase): + """Confirms the helper wiring point in run.py exists and produces the right shape.""" + + def test_envelope_helper_is_importable_from_run_module(self): + # If this import works, the helper is at the right top-level location. + from run import build_simulate_envelope, SIMULATE_COMMAND + self.assertEqual(SIMULATE_COMMAND, "simulate") + # And the helper produces stable shape: + line = build_simulate_envelope({"task_id": "x"}, "y") + self.assertEqual(json.loads(line), {"task_id": "x", "workspace_name": "y"})