
feat: add LeRobot v3 metadata validation checks #8

Open
kck325 wants to merge 5 commits into main from chandra/lerobot-v3-metadata-checker

Conversation

@kck325
Contributor

@kck325 kck325 commented Mar 18, 2026

Summary

Add LerobotV3MetadataChecker — 7 pre-ingestion validation checks for LeRobot v3 datasets:

  1. tasks.parquet — must exist (flags tasks.jsonl-only datasets)
  2. Episodes parquet columns — requires data/chunk_index, data/file_index, tasks
  3. Feature shapes — no empty shapes ([] should be [1])
  4. Path templates — data_path and video_path must use standard placeholders ({episode_chunk}, {episode_index}, {video_key})
  5. Video file existence — all files referenced by episodes must exist
  6. Timestamp consistency — all relative or all absolute across data parquet files
  7. Episode contiguity — rows for each episode_index must be contiguous
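The contiguity check (7) can be sketched as a single pass over the episode_index column: an episode is flagged if its rows re-appear after its run has already closed. This is a minimal sketch; the function name, the plain-iterable input, and the return type are assumptions, not the checker's actual API, which operates on parquet files.

```python
from typing import Iterable, List


def find_noncontiguous_episodes(episode_indices: Iterable[int]) -> List[int]:
    """Return episode_index values whose rows are split across multiple runs.

    Hypothetical sketch of check 7: rows for each episode_index must form
    one contiguous block in the data parquet.
    """
    closed = set()      # episodes whose contiguous run has already ended
    offenders = set()
    prev = None
    for idx in episode_indices:
        if idx != prev:
            if idx in closed:
                offenders.add(idx)  # episode re-appears after its run closed
            if prev is not None:
                closed.add(prev)
            prev = idx
    return sorted(offenders)
```

Reading the column in order (rather than grouping) keeps the check O(n) in rows and cheap enough to run at sharing time.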

Motivation

These checks catch data issues at sharing time rather than at conversion time, preventing runtime fixups in downstream consumers. Partners can validate their datasets before uploading.

Test plan

  • 17 new tests covering all 7 checks (pass and fail cases)
  • Updated existing test fixtures to include v3 metadata
  • All 51 tests pass with zero regressions

🤖 Generated with Claude Code

kck325 and others added 4 commits March 18, 2026 17:57
Add LerobotV3MetadataChecker with 7 pre-ingestion validation checks:
1. tasks.parquet exists (flags tasks.jsonl-only datasets)
2. Episodes parquet has required columns (chunk_index, file_index, tasks)
3. Feature shapes are non-empty ([] should be [1])
4. File path templates use standard placeholders
5. Video files referenced by episodes exist
6. Timestamp consistency (all relative or all absolute)
7. Episode row contiguity in data parquet

Integrated into the main validation orchestrator. Includes 17 new
tests and updated existing test fixtures to include v3 metadata.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Null/missing start_timestamp values are now flagged as errors instead of
being silently skipped. Downstream converters require a valid collection
timestamp for every episode — catching this at validation time prevents
runtime failures.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add two new validation checks to catch dataset issues that cause
runtime failures in the LeRobot MCAP converter:

8. Data parquet must not contain video struct columns — these cause
   CastError when the dataset loader tries to match features schema.

9. Episode parquet must include videos/{key}/chunk_index and
   videos/{key}/from_timestamp for each video feature — the dataset
   loader needs these to resolve video files and timestamps.

Both issues were discovered during fpvlabs dataset ingestion and
required workarounds in the converter.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
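Checks 8 and 9 above are both schema-shape checks and could be sketched as set operations over column names. These helper names and the plain-list inputs are assumptions for illustration; the real checker inspects the parquet schemas directly.

```python
from typing import Iterable, List


def video_columns_in_data(data_columns: Iterable[str],
                          video_keys: Iterable[str]) -> List[str]:
    """Check 8 sketch: video feature columns must not leak into the data
    parquet, where they would cause a CastError at load time."""
    return sorted(set(data_columns) & set(video_keys))


def missing_video_episode_columns(episode_columns: Iterable[str],
                                  video_keys: Iterable[str]) -> List[str]:
    """Check 9 sketch: each video key needs videos/{key}/chunk_index and
    videos/{key}/from_timestamp in the episodes parquet."""
    have = set(episode_columns)
    missing = []
    for key in video_keys:
        for col in (f"videos/{key}/chunk_index",
                    f"videos/{key}/from_timestamp"):
            if col not in have:
                missing.append(col)
    return missing
```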
Add 6 P0 validators as lerobot_validator/v3_checks.py to catch the most
common data quality issues before partner upload:

- V1  validate_tasks_format: error if no tasks file, warn if only jsonl
- V2  validate_codebase_version: require codebase_version starts with v3.
- V5  validate_feature_shapes: reject shape=[], require 3-element image shapes
- V7  validate_timestamps: reject absolute Unix epoch in data parquets
- V11 validate_custom_metadata_csv: require episode_index/episode_id, reject
      null/duplicate episode_ids
- V12 validate_start_timestamp: require plausible Unix epoch floats

Wire validate_v3_dataset() into the LerobotDatasetValidator orchestrator
so errors surface automatically, and add get_warnings() support. Update
existing test fixtures to include codebase_version so integration tests
pass with the new checks.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
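The V12 "plausible Unix epoch floats" requirement could look like the sketch below. The constant names match those made public in a later commit (UNIX_EPOCH_THRESHOLD, UNIX_EPOCH_MAX), but their values here and the function signature are assumptions, not the PR's actual implementation.

```python
import math
from typing import Optional

# Assumed plausibility bounds in Unix seconds (~2001-09 to ~2033-05).
UNIX_EPOCH_THRESHOLD = 1_000_000_000.0
UNIX_EPOCH_MAX = 2_000_000_000.0


def start_timestamp_error(value: object) -> Optional[str]:
    """Sketch of V12: return an error message for an implausible
    start_timestamp, or None if it looks like a valid Unix epoch float."""
    if value is None or (isinstance(value, float) and math.isnan(value)):
        return "start_timestamp is null/missing"
    try:
        ts = float(value)  # type: ignore[arg-type]
    except (TypeError, ValueError):
        return f"start_timestamp is not a number: {value!r}"
    if not (UNIX_EPOCH_THRESHOLD <= ts <= UNIX_EPOCH_MAX):
        return f"start_timestamp {ts} is not a plausible Unix epoch"
    return None
```

Treating null/NaN as an error rather than skipping matches the earlier commit's rationale: downstream converters require a valid collection timestamp for every episode.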
…es, share CSV reads

- Import REQUIRED_METADATA_COLUMNS from schemas.py instead of hardcoding
- Add Issue.error() / Issue.warning() factory methods to reduce boilerplate
- Share CSV DataFrame between V11 and V12 via _df_cache to avoid redundant reads
- validate_timestamps checks only first episode for monotonicity (not all)
- Remove unused `thorough` parameter from validate_v3_dataset
- Make timestamp constants public (UNIX_EPOCH_THRESHOLD, UNIX_EPOCH_MAX)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
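The Issue.error() / Issue.warning() factories mentioned above could be as simple as the sketch below. The field set is a minimal assumption; the real Issue class in the PR likely carries more context (check id, file path, etc.).

```python
from dataclasses import dataclass


@dataclass
class Issue:
    """Hypothetical minimal shape of a validation issue record."""
    severity: str
    message: str

    @classmethod
    def error(cls, message: str) -> "Issue":
        """Shorthand for an error-severity issue."""
        return cls("error", message)

    @classmethod
    def warning(cls, message: str) -> "Issue":
        """Shorthand for a warning-severity issue."""
        return cls("warning", message)
```

The payoff is that each validator emits `Issue.error("...")` instead of repeating the severity literal, which is the boilerplate reduction the commit describes.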
