Conversation
- Add line numbers for JSON responses
- Add Copy JSON button with clipboard functionality
- Add Wrap Lines toggle for long JSON paths
- Improve typography and syntax highlighting
- Fix response body display issues
- Auto-detect and enhance JSON blocks in response area only

- Add customCss for Scalar JSON response styling
- Improve Swagger UI DOM ready handling
- Add better error handling for Swagger UI
- Fix resource loading timing issues
…ment_Sang-Pham Cu 86ewd2pew instance state management sang pham
…T-Instance-Management_Sang-Pham
…ement_Sang-Pham Cu 86ewd2pry secu rt instance management sang pham
…nt_Sang-Pham Cu 86ewd2pv4 secu rt areas management sang pham
- Add securt_line_handler, manager, storage, and types
- Update analytics_entities_manager to support lines
- Update OpenAPI specifications
- Update CMakeLists.txt and main.cpp

…-Pham
- Resolved conflicts by keeping both Lines and Areas features
- Updated analytics_entities_manager to support both managers
- Updated CMakeLists.txt to include both handlers
- Updated OpenAPI specs to include both schemas

- Add all SecuRT Lines endpoints (POST/PUT/DELETE for counting, crossing, tailgating)
- Add LineWrite, TailgatingLineWrite, LineRead schemas
- Fix YAML syntax error in description field
- Update all line endpoints tags to 'Lines SecuRT' for proper grouping in Scalar

- Add SecuRT Lines types (counting, crossing, tailgating)
- Implement SecuRTLineStorage, SecuRTLineManager, SecuRTLineHandler
- Add CRUD endpoints for all line types
- Update API documentation with all endpoints and schemas
- Integrate lines into AnalyticsEntitiesManager
…nt_Sang-Pham Cu 86ewd2pxa secu rt lines management sang pham
- Add feature extraction, attributes extraction, performance profile
- Add face detection, LPR, PIP, surrender detection
- Add motion area, masking areas, exclusion areas management
- Update OpenAPI spec with new endpoints
- Add manual test documentation
- Phase 1: Split out utilities (model resolver, request utils)
- Phase 2: Split out source/destination nodes
- Phase 3: Split out detector nodes (29 functions)
- Reduce pipeline_builder.cpp from 8116 -> 6314 lines

- Phase 4: Split out broker nodes (15 functions)
- Phase 4: Split out behavior analysis nodes (6 functions)
- Phase 4: Split out other nodes (OSD, Tracking - 3 functions)
- Reduce pipeline_builder.cpp from 6314 -> 4061 lines

- Add securt_line_handler
- Update securt_instance_manager
- Add securt_pipeline_integration
- Update lines_handler, solution_registry, instance_registry
…directions
- Parse CrossingLines JSON to extract line names, colors, and directions
- Use crossline_config API to store line metadata in ba_crossline_node
- Set line_display_config for ba_crossline_osd_node to display custom names
- Support unlimited lines per channel (default to channel 0)
- All lines are added using add_line() API with full configs
- Improved logging to show total lines parsed and added
- Fix: Lines now display custom names instead of 'Line 0', 'Line 1', etc.
- Fix: Multiple lines are properly displayed on video output
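The CrossingLines parsing step above can be sketched as follows. The JSON field names (name, color, direction, start, end) and the parse_crossing_lines helper are assumptions for illustration, not the SDK's actual schema or API.

```python
import json

def parse_crossing_lines(raw: str) -> list:
    """Parse a CrossingLines JSON string into per-line display configs.

    Assumed shape: a JSON array of objects with name, color, direction,
    and start/end coordinates. Missing fields fall back to defaults.
    """
    configs = []
    for i, entry in enumerate(json.loads(raw)):
        configs.append({
            "name": entry.get("name", f"Line {i}"),   # custom name instead of 'Line 0'
            "color": entry.get("color", "#00FF00"),
            "direction": entry.get("direction", "both"),
            "start": tuple(entry.get("start", (0.0, 0.0))),
            "end": tuple(entry.get("end", (1.0, 1.0))),
        })
    return configs

raw = '[{"name": "Gate A", "color": "#FF0000", "direction": "in", "start": [0.1, 0.5], "end": [0.9, 0.5]}]'
lines = parse_crossing_lines(raw)
```

Each resulting config could then be handed to the add_line() API mentioned above, one call per parsed line.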
- Fix OpenCV freetype library copy in postinst script:
  * Prioritize actual files (.4.10.0) over symlinks
  * Properly resolve symlink chains before copying
  * Improve error reporting and verification
  * Add directory write permission check
- Create fix_freetype.sh script for manual fixing:
  * Auto-detect and copy OpenCV 4.10 freetype library
  * Set correct permissions and update ldconfig
  * Restart service automatically
  * Include in package via debian/rules
- Enforce OpenCV 4.10 requirement (remove OpenCV 4.6 fallback):
  * Remove all OpenCV 4.6 fallback paths from all scripts
  * Update build_deb_all_in_one.sh to only search for OpenCV 4.10
  * Update debian/rules to only use OpenCV 4.10
  * Clear error messages when OpenCV 4.10 is not found
- Update build_deb_all_in_one.sh:
  * Improve OpenCV 4.10 freetype detection
  * Support .4.10.0, .4.10, and .410 file variants
  * Better symlink resolution with cp -L

This ensures CVEDIX SDK compatibility requires OpenCV 4.10 with freetype support.
- Add rtsp_des to optional nodes list in pipeline builder
- Return nullptr with warning if CVEDIX_WITH_GSTREAMER not set
- Pipeline continues gracefully when RTSP destination unavailable
Made-with: Cursor
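The optional-node fallback in the rtsp_des change above can be sketched in Python. The node names, build_pipeline, and create_rtsp_des are illustrative stand-ins for the C++ pipeline builder, and CVEDIX_WITH_GSTREAMER is modeled as an environment variable here although it is a compile-time flag in the real build.

```python
import logging
import os

# Nodes the pipeline may legitimately run without (names illustrative).
OPTIONAL_NODES = {"rtsp_des"}

def create_rtsp_des():
    """Return an RTSP destination node, or None when GStreamer support
    is unavailable, mirroring the nullptr-with-warning behavior."""
    if not os.environ.get("CVEDIX_WITH_GSTREAMER"):
        logging.warning("rtsp_des unavailable: CVEDIX_WITH_GSTREAMER not set")
        return None
    return "rtsp_des-node"  # stand-in for a real node object

def build_pipeline(node_names):
    """Build nodes in order; skip optional nodes that fail to create."""
    nodes = []
    for name in node_names:
        node = create_rtsp_des() if name == "rtsp_des" else f"{name}-node"
        if node is None:
            if name in OPTIONAL_NODES:
                continue  # pipeline continues gracefully without this node
            raise RuntimeError(f"required node {name} failed to build")
        nodes.append(node)
    return nodes
```

The key design point is that only nodes on the optional list may be skipped; a required node that fails still aborts the build.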
…essor Made-with: Cursor
…GNIZER_PATH Made-with: Cursor
…dler uses facade Made-with: Cursor
…ntime, ENV vars, vision & design docs Made-with: Cursor
- Package, service, binary: edgeos-api, edgeos-worker
- Install path: /opt/edgeos-api
- Debian: PACKAGE_NAME edgeos-api, INSTALL_DIR/LIB_DIR /opt/edgeos-api
- Env: EDGEOS_API_INSTALL_DIR (fallback EDGE_AI_API_INSTALL_DIR)
- Rename file: edge_ai_worker.cpp -> edgeos_worker.cpp
- Library: libedgeos_core.so
- Update README, examples, api-specs, tests, packaging, deploy, scripts

Made-with: Cursor

- Add setupRTMPDestinationActivityHook: set stream_status_hooker and meta_handled_hooker on rtmp_des nodes to call updateRTMPDestinationActivity when frames are actually pushed, not only when app_des has a frame to cache.
- Call setupRTMPDestinationActivityHook after setupFrameCaptureHook when starting the pipeline.
- Set the same hooks on the new rtmp_des node after reconnect so the reconnected destination also reports activity correctly.
- Fixes false 'no activity' and premature stop of reconnect attempts when the source produces empty frames (e.g. gst_sample_get_caps failure).

Made-with: Cursor
…hecklist) Made-with: Cursor
…cklist) Made-with: Cursor
…ecklist) Made-with: Cursor
…le instances Made-with: Cursor
…s and index Made-with: Cursor
…list updates
- Add ba_crowding solution (file_src → yolo → sort_track → ba_crowding → ba_crowding_osd → file_des)
- Add createBACrowdingNode/createBACrowdingOSDNode with CrowdingZones JSON parsing
- Add example instance ba_crowding/yolo/example_ba_crowding_file.json and READMEs
- Checklist: fix #48 mllm_analysis path, add ba_crowding (#12), document ba_movement/ba_speed remaining

Made-with: Cursor

…ates
- Add rtmp_des_lastframe_fallback_node, rtmp_lastframe_fallback_proxy_node
- Update pipeline_builder, destination_nodes, detector_nodes, system_config
- Update instance_registry_rtmp_monitor, CMakeLists, tests

Made-with: Cursor
…file input, RTMP reconnect
- Remove double decode: validate only in handler, decode once in FrameProcessor
- Frame queue default 30, pop(100ms), backpressure 503 when queue >= 80%
- file_des only when input is file path; RTMP/RTSP use matching destination only
- RTMP reconnect init default 10s; SIGSEGV message clarity; docs cleanup

Made-with: Cursor
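The backpressure rule above (frame queue default 30, reject with HTTP 503 once the queue is at least 80% full) can be sketched as follows; the constants and the should_reject helper are illustrative, not the actual implementation.

```python
from queue import Queue

FRAME_QUEUE_MAX = 30        # default frame-queue depth from the change above
BACKPRESSURE_RATIO = 0.8    # answer 503 once the queue is >= 80% full

def should_reject(q: Queue) -> bool:
    """True when the handler should return HTTP 503 instead of enqueueing."""
    return q.qsize() >= FRAME_QUEUE_MAX * BACKPRESSURE_RATIO

frames = Queue(maxsize=FRAME_QUEUE_MAX)
for i in range(24):         # 24/30 = exactly 80% occupancy
    frames.put(i)
```

Rejecting before the queue is completely full leaves headroom for frames already in flight, so the producer gets a 503 before the consumer stalls.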
- Fix rtmp_des proxy attach: use correct previous node when extraNodes (proxy+rtmp_des) so proxy receives frames from OSD/sort_track
- Treat CROSSLINE_START_X/Y, CROSSLINE_END_X/Y as crossline config so ba_crossline and ba_crossline_osd are not skipped when CrossingLines JSON is absent
- Add rtmpPlaybackUrl in instance output (API): SDK adds _0 to stream key; return playback URL for viewing (instance_handler + ai_handler)

Made-with: Cursor
…instance until running Made-with: Cursor
…createNode and createCrosslineMQTTBrokerNode Made-with: Cursor
…untime-multiple-process-cho-phien-ban-tiep-theo_Sang-Pham_2 Cu 86evxtxeu cap nhat logic chay runtime multiple process cho phien ban tiep theo sang pham 2
    mqtt_cfg = get_mqtt_config_from_params(config.get("additionalParams") or {})
    if not args.no_mqtt and _have_paho and mqtt_cfg:
        broker, port, topic, user, password = mqtt_cfg
        print(f"Subscribing to MQTT: {broker}:{port} topic={topic}")
Check failure — Code scanning / CodeQL: Clear-text logging of sensitive information (High, test)
Copilot Autofix (AI, 8 days ago):
In general, the fix is to ensure that no sensitive data (such as MQTT passwords) is written to logs in clear text. This can be done by either not logging those values at all or by masking/redacting them. In this specific case, we only need to adjust the print statement that logs MQTT subscription details so that it does not risk exposing any part of a tainted configuration tuple. We should keep the existing functionality (subscribing to MQTT) unchanged while reducing logged detail to what is strictly necessary and non-sensitive.
Concretely, in tests/benchmark/run_benchmark.py, around line 528, we should change the print(f"Subscribing to MQTT: {broker}:{port} topic={topic}") to a version that does not include any tainted fields or at least avoids detailed endpoint information derived from the tainted tuple. A simple and safe option is to log only that an MQTT subscription is being set up, optionally including the port (which is generic and not secret) or a static message. This keeps behavior intact while preventing future accidental introduction of {password} or other sensitive values into the log line. No new methods or imports are needed; we are only modifying the string being printed.
@@ -525,7 +525,7 @@
     mqtt_cfg = get_mqtt_config_from_params(config.get("additionalParams") or {})
     if not args.no_mqtt and _have_paho and mqtt_cfg:
         broker, port, topic, user, password = mqtt_cfg
-        print(f"Subscribing to MQTT: {broker}:{port} topic={topic}")
+        print("Subscribing to MQTT events")
         with mqtt_events_lock:
             mqtt_events.clear()
         mqtt_thread = run_mqtt_subscriber(broker, port, topic, user, password)
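For context, the mqtt_cfg tuple unpacked in the flagged snippet could come from a parser shaped like the sketch below. The additionalParams key names (MQTT_BROKER, MQTT_PORT, etc.) are assumptions for illustration, not the actual keys used by run_benchmark.py.

```python
def get_mqtt_config_from_params(params: dict):
    """Return (broker, port, topic, user, password), or None when MQTT
    is not configured. Key names here are illustrative guesses."""
    broker = params.get("MQTT_BROKER")
    topic = params.get("MQTT_TOPIC")
    if not broker or not topic:
        return None
    return (
        broker,
        int(params.get("MQTT_PORT", 1883)),  # 1883 is the standard MQTT port
        topic,
        params.get("MQTT_USER"),
        params.get("MQTT_PASSWORD"),
    )

cfg = get_mqtt_config_from_params({"MQTT_BROKER": "10.0.0.5", "MQTT_TOPIC": "events"})
```

Because the password rides along in this tuple, any log line interpolating the tuple's fields is one edit away from leaking it, which is exactly what CodeQL flags above.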
    mqtt_cfg = get_mqtt_config_from_params(config.get("additionalParams") or {})
    if not args.no_mqtt and _have_paho and mqtt_cfg:
        broker, port, topic, user, password = mqtt_cfg
        print(f"Subscribing to MQTT: {broker}:{port} topic={topic}")
Check failure — Code scanning / CodeQL: Clear-text logging of sensitive information (High, test)
Copilot Autofix (AI, 8 days ago):
In general, to fix clear-text logging of sensitive information, identify any log statements that might include secrets (passwords, API keys, tokens, etc.) and either remove those fields from logs or replace them with redacted placeholders. When working with objects or tuples that mix sensitive and non-sensitive data, avoid logging the whole structure and instead log only explicitly non-sensitive fields.

For this specific case in tests/benchmark/run_benchmark_generic.py, the best fix without changing functionality is:
- Keep reading and using MQTT_PASSWORD to connect, but never log it.
- Make the log statement explicitly log only non-sensitive information, and make it clear that credentials are not logged.
- Optionally annotate in the log that credentials are omitted/redacted to avoid future developers accidentally adding them.

Concretely:
- Keep get_mqtt_config_from_params as-is (lines 201–220), since it only parses configuration and does not log.
- Change the print on line 483 so that it still logs the connection target (broker, port, topic) but makes clear that credentials are not included, and ensure no credentials are interpolated now or in the future.

No new methods are needed; only a small change to the existing print statement is required, within tests/benchmark/run_benchmark_generic.py.
@@ -480,7 +480,7 @@
     mqtt_cfg = get_mqtt_config_from_params(config.get("additionalParams") or {})
     if not args.no_mqtt and _have_paho and mqtt_cfg:
         broker, port, topic, user, password = mqtt_cfg
-        print(f"Subscribing to MQTT: {broker}:{port} topic={topic}")
+        print(f"Subscribing to MQTT broker={broker}:{port}, topic={topic} (credentials not logged)")
         with mqtt_events_lock:
             mqtt_events.clear()
         mqtt_thread = run_mqtt_subscriber(broker, port, topic, user, password)
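Beyond the two one-line fixes above, a small redaction helper can keep the useful endpoint details in logs while guaranteeing credentials never appear; the helper name redact_mqtt_cfg is hypothetical and is not part of either benchmark script.

```python
def redact_mqtt_cfg(cfg: tuple) -> str:
    """Format an MQTT config tuple for logging with credentials masked.

    Expects (broker, port, topic, user, password); the user and password
    fields are replaced with fixed placeholders so that secrets cannot
    leak into logs even as the tuple's contents change.
    """
    broker, port, topic, user, _password = cfg
    user_part = "<redacted>" if user else "<none>"
    return f"broker={broker}:{port} topic={topic} user={user_part} password=<redacted>"

msg = redact_mqtt_cfg(("10.0.0.5", 1883, "events", "alice", "s3cret"))
print(f"Subscribing to MQTT: {msg}")
```

Routing every such log line through one helper centralizes the redaction policy, so a future field added to the tuple cannot be accidentally interpolated into a log message.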
No description provided.