# appsec: add smoke tests for apm standalone #6181
**Open** — florentinl wants to merge 6 commits into `main` from `florentin.labelle/APPSEC-60883/infra-less-system-tests` (base: `main`).
Changes from all commits (6 commits):

- `2267c39` appsec: add smoke tests for apm standalone (florentinl)
- `b630a1a` comment to fill things (florentinl)
- `9c677ee` enable tests for the different tracers (florentinl)
- `b4b99d6` fix format (florentinl)
- `13031a0` v1 trace payload support + small refactor (florentinl)
- `4042a19` linter (florentinl)
New file (`@@ -0,0 +1,160 @@`):

```python
# Unless explicitly stated otherwise all files in this repository are licensed under the Apache License Version 2.0.
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2021 Datadog, Inc.

"""AppSec smoke tests for the appsec_apm_standalone scenario."""

from utils import features, interfaces, remote_config as rc, weblog, scenarios

SMOKE_RC_RULE_ID = "smoke-rc-0001"
SMOKE_RC_RULE_FILE: tuple[str, dict[str, object]] = (
    "datadog/2/ASM_DD/rules/config",
    {
        "version": "2.2",
        "metadata": {"rules_version": "2.71.8182"},
        "rules": [
            {
                "id": SMOKE_RC_RULE_ID,
                "name": "Smoke RC rule",
                "tags": {
                    "type": "attack_tool",
                    "category": "attack_attempt",
                    "confidence": "1",
                },
                "conditions": [
                    {
                        "parameters": {
                            "inputs": [
                                {
                                    "address": "server.request.headers.no_cookies",
                                    "key_path": ["x-smoke-test"],
                                }
                            ],
                            "regex": "^rc-smoke$",
                        },
                        "operator": "match_regex",
                    }
                ],
                "transformers": [],
            }
        ],
    },
)


class AgentLevelSmokeTests:
    def setup_lfi_smoke(self) -> None:
        self.r = weblog.get("/rasp/lfi", params={"file": "../etc/passwd"})

    def test_lfi_smoke(self) -> None:
        assert self.r.status_code == 200

        expected_rule = "rasp-930-100"
        expected_params = {
            "resource": {"address": "server.io.fs.file", "value": "../etc/passwd"},
            "params": {"address": "server.request.query", "value": "../etc/passwd"},
        }

        for _, _, appsec_data in interfaces.agent.get_appsec_data(self.r):
            for trigger in appsec_data.get("triggers", []):
                if trigger.get("rule", {}).get("id") != expected_rule:
                    continue
                for match in trigger.get("rule_matches", []):
                    for params in match.get("parameters", []):
                        if not isinstance(params, dict):
                            continue
                        if all(
                            isinstance(params.get(name), dict)
                            and params[name].get("address") == fields.get("address")
                            and ("value" not in fields or params[name].get("value") == fields["value"])
                            for name, fields in expected_params.items()
                        ):
                            return

        raise AssertionError(f"No RASP attack found for rule {expected_rule}")

    def setup_api_security_smoke(self) -> None:
        self.r = weblog.get("/waf")

    def test_api_security_smoke(self) -> None:
        assert any(
            any(key.startswith("_dd.appsec.s.") for key in span.meta)
            for _, span in interfaces.agent.get_spans(self.r)
        )

    def setup_telemetry_smoke(self) -> None:
        weblog.get("/")
        self.r = weblog.get("/waf", headers={"User-Agent": "Arachni/v1"})

    def test_telemetry_smoke(self) -> None:
        telemetry_data = list(interfaces.agent.get_telemetry_data(flatten_message_batches=False))

        assert telemetry_data, "Agent should forward telemetry data from the library"

        found_metrics = False
        found_waf_metric = False

        for data in telemetry_data:
            content = data.get("request", {}).get("content", {})
            request_type = content.get("request_type")

            if request_type == "generate-metrics":
                found_metrics = True

                payload = content.get("payload", {})
                series = payload.get("series", [])

                for metric_series in series:
                    metric_name = metric_series.get("metric", "")
                    if metric_name.startswith("waf."):
                        found_waf_metric = True
                        break

            if found_metrics and found_waf_metric:
                break

        assert found_metrics
        assert found_waf_metric

    def setup_remote_config_smoke(self) -> None:
        self.config_state = rc.tracer_rc_state.reset().set_config(*SMOKE_RC_RULE_FILE).apply().state
        self.r = weblog.get("/waf", headers={"X-Smoke-Test": "rc-smoke"})

    def test_remote_config_smoke(self) -> None:
        assert self.config_state == rc.ApplyState.ACKNOWLEDGED, (
            f"Remote config should be acknowledged, got {self.config_state}"
        )

        assert any(
            span.meta.get("appsec.event") == "true"
            and any(
                trigger.get("rule", {}).get("id") == SMOKE_RC_RULE_ID
                for trigger in appsec_data.get("triggers", [])
            )
            for _, span, appsec_data in interfaces.agent.get_appsec_data(self.r)
        ), "Agent should forward AppSec events for the RC-updated rule"

    def setup_attack_detection_smoke(self) -> None:
        rc.tracer_rc_state.reset().apply()
        self.r = weblog.get("/waf", headers={"User-Agent": "Arachni/v1"})

    def test_attack_detection_smoke(self) -> None:
        found_attack = False
        has_appsec_data = False

        for _, span, appsec_data in interfaces.agent.get_appsec_data(self.r):
            span_meta = span.get("meta", {}) or span.get("attributes", {})
            if span_meta.get("appsec.event") == "true":
                found_attack = True

            if appsec_data is not None:
                has_appsec_data = True
                break

        assert found_attack, "Agent should forward detected attacks in span metadata"
        assert has_appsec_data, "Agent spans should include AppSec payload (JSON or metastruct)"


@features.appsec_apm_standalone
@scenarios.appsec_apm_standalone
@scenarios.appsec_standalone_apm_standalone
class Test_AppSecAPMStandalone(AgentLevelSmokeTests):
    pass
```
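The nested loop in `test_lfi_smoke` is essentially a search over the WAF event structure (triggers → rule_matches → parameters). A minimal standalone sketch of that predicate, using the same payload shape the test expects (the helper name is hypothetical, not part of the PR):

```python
from typing import Any


def has_expected_match(
    appsec_data: dict[str, Any], rule_id: str, expected_params: dict[str, dict[str, str]]
) -> bool:
    """Return True if a trigger for `rule_id` carries a parameter set whose
    named entries match the expected address (and value, when one is given)."""
    for trigger in appsec_data.get("triggers", []):
        if trigger.get("rule", {}).get("id") != rule_id:
            continue
        for match in trigger.get("rule_matches", []):
            for params in match.get("parameters", []):
                if not isinstance(params, dict):
                    continue
                # Every expected entry must be present with a matching address;
                # the value check is skipped when the expectation omits it.
                if all(
                    isinstance(params.get(name), dict)
                    and params[name].get("address") == fields.get("address")
                    and ("value" not in fields or params[name].get("value") == fields["value"])
                    for name, fields in expected_params.items()
                ):
                    return True
    return False
```

Extracting the predicate this way would let the test body reduce to a single assertion instead of an early `return` plus a trailing `raise AssertionError`.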
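For reference, the `match_regex` condition in `SMOKE_RC_RULE_FILE` fires when the `x-smoke-test` header equals `rc-smoke`. A rough approximation of that evaluation with Python's `re` module (a sketch only — not the actual WAF engine, and the function name is made up):

```python
import re


def rule_matches_request(rule: dict, headers: dict[str, str]) -> bool:
    """Approximate a single match_regex condition against request headers.

    Header keys are lowercased, mimicking the normalized
    server.request.headers.no_cookies address.
    """
    lowered = {k.lower(): v for k, v in headers.items()}
    for condition in rule.get("conditions", []):
        if condition.get("operator") != "match_regex":
            continue
        params = condition["parameters"]
        pattern = re.compile(params["regex"])
        for inp in params["inputs"]:
            for key in inp.get("key_path", []):
                value = lowered.get(key)
                if value is not None and pattern.search(value):
                    return True
    return False
```

This mirrors why `setup_remote_config_smoke` sends `X-Smoke-Test: rc-smoke`: any other header value fails the anchored `^rc-smoke$` pattern.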
Changes to the agent interface validator (hunks `@@ -4,15 +4,19 @@` and `@@ -26,6 +30,33 @@ def ingest_file(self, src_path: str):`), as they read after the change:

```python
"""Validate data flow between agent and backend"""

import base64
import copy
import threading
from collections.abc import Callable, Generator, Iterable
from typing import Any

import msgpack

from utils._logger import logger
from utils._weblog import HttpResponse
from utils.dd_types import AgentTraceFormat, DataDogAgentSpan, DataDogAgentTrace
from utils.interfaces._core import ProxyBasedInterfaceValidator
from utils.interfaces._misc_validators import HeadersPresenceValidator


class AgentInterfaceValidator(ProxyBasedInterfaceValidator):
    def ingest_file(self, src_path: str):
        self.ready.set()
        return super().ingest_file(src_path)

    def get_appsec_data(
        self, request: HttpResponse
    ) -> Generator[tuple[dict[str, Any], DataDogAgentSpan, dict[str, Any]], Any, None]:
        for data, span in self.get_spans(request):
            json_payload = span.meta.get("_dd.appsec.json")
            if json_payload is not None:
                yield data, span, json_payload

            legacy_metastruct = span.get("metaStruct", {}).get("appsec")
            v1_metastruct = span.meta.get("appsec")
            payload = legacy_metastruct or v1_metastruct

            if isinstance(payload, str):
                try:
                    b64_decoded = base64.b64decode(payload)
                    msgpack_decoded = msgpack.loads(
                        b64_decoded, raw=False, strict_map_key=False, unicode_errors="replace"
                    )
                    yield data, span, msgpack_decoded
                except Exception:
                    logger.warning(
                        "Failed to decode appsec payload for request %s on span %s",
                        request.get_rid(),
                        span.get_span_name(),
                        exc_info=True,
                    )

    def get_profiling_data(self):
        yield from self.get_data(path_filters="/api/v2/profile")
```

> Review comment (Collaborator, marked resolved): Could you move this deserialization inside the proxy? Ping me if you need help (it may not be obvious to do).