
chore(trainer): add unit tests for get_args_using_torchtune_config and get_trainer_cr_from_builtin_trainer#416

Open
prabhaharanv wants to merge 1 commit into kubeflow:main from prabhaharanv:test/add-builtin-trainer-utils-coverage

Conversation


@prabhaharanv prabhaharanv commented Mar 23, 2026

What this PR does

Adds comprehensive unit tests for two previously untested utility functions in kubeflow/trainer/backends/kubernetes/utils.py:

test_get_args_using_torchtune_config — 8 parametrized test cases:

  1. Empty TorchTuneConfig produces empty args
  2. dtype only produces single dtype arg
  3. All scalar fields (dtype, batch_size, epochs, loss)
  4. peft_config with LoraConfig appends LoRA args
  5. dataset_preprocess_config appends dataset args
  6. HuggingFace dataset URI — directory path via initializer (hf://tatsu-lab/alpaca)
  7. HuggingFace dataset URI — file path via initializer (hf://tatsu-lab/alpaca/data.json)
  8. Invalid dtype raises ValueError
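
To illustrate the kind of conversion these cases exercise, here is a minimal stand-in sketch (not the real `get_args_using_torchtune_config`; the config class, valid-dtype set, and arg format are assumptions for illustration only): map optional config fields to CLI-style args and reject unknown dtypes with a `ValueError`.

```python
# Illustrative stand-in only; names and formats are assumptions,
# not the real kubeflow SDK API.
from dataclasses import dataclass
from typing import Optional

VALID_DTYPES = {"bf16", "fp32"}  # assumed set of accepted dtypes


@dataclass
class FakeTorchTuneConfig:
    dtype: Optional[str] = None
    batch_size: Optional[int] = None
    epochs: Optional[int] = None


def build_args(config: FakeTorchTuneConfig) -> list[str]:
    """Convert set fields into torchtune-style CLI args; empty config -> []."""
    args: list[str] = []
    if config.dtype is not None:
        if config.dtype not in VALID_DTYPES:
            raise ValueError(f"invalid dtype: {config.dtype}")
        args.append(f"dtype={config.dtype}")
    if config.batch_size is not None:
        args.append(f"batch_size={config.batch_size}")
    if config.epochs is not None:
        args.append(f"epochs={config.epochs}")
    return args
```

This mirrors cases 1-3 and 8 above: an empty config yields an empty list, each set scalar appends one arg, and a bad dtype raises.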

test_get_trainer_cr_from_builtin_trainer — 4 parametrized test cases:

  1. Config with num_nodes and batch_size
  2. Config with resources_per_node
  3. Empty config produces trainer with empty args
  4. Invalid config type raises ValueError
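
The cases above follow the `TestCase` dataclass plus `pytest.mark.parametrize` pattern described in the PR. A self-contained sketch of that pattern (the function under test and all field names here are illustrative stand-ins, not the real `utils_test.py` code):

```python
# Sketch of the TestCase-dataclass + parametrize pattern; fake_build_args
# is a stand-in for the real function under test.
from dataclasses import dataclass, field
from typing import Any, Optional

import pytest


@dataclass
class TestCase:
    name: str
    config: dict[str, Any] = field(default_factory=dict)
    expected_args: Optional[list[str]] = None
    expected_error: Optional[type] = None


def fake_build_args(config: dict[str, Any]) -> list[str]:
    """Stand-in conversion: render fields as args, reject bad num_nodes."""
    if config.get("num_nodes", 1) < 1:
        raise ValueError("num_nodes must be >= 1")
    return [f"{k}={v}" for k, v in sorted(config.items())]


@pytest.mark.parametrize(
    "test_case",
    [
        TestCase(name="empty config produces empty args", expected_args=[]),
        TestCase(
            name="num_nodes and batch_size",
            config={"batch_size": 8, "num_nodes": 2},
            expected_args=["batch_size=8", "num_nodes=2"],
        ),
        TestCase(
            name="invalid config raises",
            config={"num_nodes": 0},
            expected_error=ValueError,
        ),
    ],
    ids=lambda tc: tc.name,
)
def test_fake_build_args(test_case: TestCase):
    if test_case.expected_error is not None:
        with pytest.raises(test_case.expected_error):
            fake_build_args(test_case.config)
    else:
        assert fake_build_args(test_case.config) == test_case.expected_args
```

Each `TestCase` carries either an expected result or an expected error type, so success and failure scenarios share one test body.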

Why

Both get_args_using_torchtune_config() and get_trainer_cr_from_builtin_trainer() had zero test coverage — confirmed via grep across the entire test suite. These functions are critical for converting SDK BuiltinTrainer types into Kubernetes CR models and CLI arguments for torchtune.

How

  • Tests follow existing patterns in utils_test.py using TestCase dataclass and pytest.mark.parametrize
  • Added helper _build_builtin_runtime() to construct a torchtune runtime with TORCH_TUNE_COMMAND
  • All 93 tests pass (81 existing + 12 new), zero regressions
  • Passes ruff check, ruff format --check, and pre-commit hooks

Copilot AI review requested due to automatic review settings March 23, 2026 08:11
@github-actions
Contributor

🎉 Welcome to the Kubeflow SDK! 🎉

Thanks for opening your first PR! We're happy to have you as part of our community 🚀

Here's what happens next:

  • If you haven't already, please check out our Contributing Guide for repo-specific guidelines and the Kubeflow Contributor Guide for general community standards
  • Our team will review your PR soon! cc @kubeflow/kubeflow-sdk-team

Join the community:

Feel free to ask questions in the comments if you need any help or clarification!
Thanks again for contributing to Kubeflow! 🙏

@google-oss-prow
Contributor

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign kramaranya for approval. For more information see the Kubernetes Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

Contributor

Copilot AI left a comment


Pull request overview

Adds unit tests to improve coverage for TorchTune/BuiltinTrainer Kubernetes utility conversions, ensuring TorchTuneConfig fields and BuiltinTrainer CR generation behave as expected.

Changes:

  • Added _build_builtin_runtime() helper for constructing a TorchTune BuiltinTrainer runtime in tests.
  • Added parametrized tests for utils.get_args_using_torchtune_config().
  • Added parametrized tests for utils.get_trainer_cr_from_builtin_trainer().

Comment on lines +862 to +866
    @pytest.mark.parametrize(
        "test_case",
        [
            TestCase(
                name="empty config produces empty args",

Copilot AI Mar 23, 2026


The PR description lists several test scenarios (e.g., “model_args only” and a “full config with all fields”) that aren’t represented in this parametrization, and TorchTuneConfig doesn’t appear to have a model_args field; please add the missing cases or update the PR description to match what’s actually tested.

…d get_trainer_cr_from_builtin_trainer

Add comprehensive unit tests for two previously untested utility functions
in kubeflow/trainer/backends/kubernetes/utils.py:

- test_get_args_using_torchtune_config: 8 parametrized test cases covering
  empty config, dtype only, all scalar fields (dtype/batch_size/epochs/loss),
  LoRA peft_config, dataset_preprocess_config, HuggingFace dataset URIs
  (directory and file paths via initializer), and invalid dtype error.

- test_get_trainer_cr_from_builtin_trainer: 4 parametrized test cases
  covering num_nodes with batch_size, resources_per_node, empty config,
  and invalid config type error.

These functions had zero test coverage. The new tests follow existing
patterns in the file using TestCase dataclass and pytest.mark.parametrize.

Signed-off-by: Prabhaharan Velu <haranprabha.v@gmail.com>
@prabhaharanv prabhaharanv force-pushed the test/add-builtin-trainer-utils-coverage branch from f76b5fa to c720be2 Compare March 23, 2026 21:05
@prabhaharanv prabhaharanv requested a review from Copilot March 24, 2026 04:32
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 1 out of 1 changed files in this pull request and generated no new comments.
