Unit tests #83
Conversation
Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
Signed-off-by: Anh-Uong <anh.uong@ibm.com>
This PR looks good to me. Before we add edge cases, @tharapalanivel can we also add a unit test for fine tuning? There will not be any PEFT type associated with it.
  accelerate>=0.20.3
  packaging
- transformers>=4.34.1
+ transformers>=4.34.1,<4.38.0
#53 is merged so you shouldn't need this cap, thanks.
@Ssukriti Should we keep the cap (or a static version) for the transformers package to avoid unintended errors like the xla_fsdp_v2 one? We could create a GitHub workflow to run the tests and then update the cap regularly.
We don't need to keep a static version, but yes, in the optional dependencies PR @gkumbhat is looking into how to cap, and we may cap to the next major release. Now that CI/CD automatically pulls new release versions, if we see failing builds, we will update accordingly.
The errors we were seeing with xla_fsdp_v2 were actually due to code we wrote, which was good to catch and fix. It was not an API change from transformers; we were setting environment variables incorrectly.
In general, if there is a specific version that doesn't work or has a bug, then we can also ask pip to ignore that particular version.
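To illustrate the suggestion above: instead of a hard upper cap, a `!=` exclusion in the version specifier skips only a known-bad release while allowing later fixes. A minimal sketch using the `packaging` library (already a project dependency); the excluded version number here is purely illustrative, not a real known-bad transformers release:

```python
# Sketch: excluding one known-bad release instead of capping the whole range.
# The specific version "4.38.0" is an illustrative placeholder.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=4.34.1,!=4.38.0")

assert Version("4.37.2") in spec      # earlier releases are still allowed
assert Version("4.38.0") not in spec  # the excluded release is skipped
assert Version("4.38.1") in spec      # later fixes are picked up again
```

The same `!=` syntax works directly in `requirements.txt` or `pyproject.toml` dependency strings, e.g. `transformers>=4.34.1,!=4.38.0`.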
add more unit tests and refactor
Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
  assert "Simply put, the theory of relativity states that" in output_inference

  def test_run_train_lora_target_modules():
What's the difference between this and the test above? Can we combine them into one?
My understanding is that first we check that the default target modules are used, the next one is for custom target modules specified by the user, and the last is for all-linear. I've parameterized it, but worth confirming with Anh.
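The parameterization mentioned above can be sketched with `pytest.mark.parametrize`, which collapses the three near-identical tests into one. This is a hypothetical illustration of the pattern only; the module names, defaults, and config shape below are placeholders, not the repo's actual code:

```python
# Hypothetical sketch of combining the three LoRA target-module tests
# into one parameterized test. All names and defaults are illustrative.
import pytest

TARGET_MODULE_CASES = [
    (None, ["q_proj", "v_proj"]),      # no user input -> assumed defaults
    (["c_attn"], ["c_attn"]),          # user-specified custom modules
    (["all-linear"], ["all-linear"]),  # the all-linear shorthand
]

@pytest.mark.parametrize("requested,expected", TARGET_MODULE_CASES)
def test_lora_target_modules(requested, expected):
    # The real test would build a LoraConfig and run tuning; here we
    # only show how one test body covers all three scenarios.
    config = {"target_modules": requested or ["q_proj", "v_proj"]}
    assert config["target_modules"] == expected
```

Each tuple in `TARGET_MODULE_CASES` becomes its own test case in the pytest report, so a failure still pinpoints which scenario broke.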
Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
@@ -0,0 +1,22 @@
# Copyright The IBM Tuning Team
Curious about the copyright notice. Where is this coming from?
"IBM Tuning Team" was suggested by Raghu; the rest is from caikit.
      invalid_params
  )

  with pytest.raises(ValueError, match=exc_msg):
I generally avoid matching the exact error message and just check for ValueError, with a comment explaining why. But I'll let this go; I don't think we will update the message much.
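The two styles in the comment above can be shown side by side. Note that `pytest.raises(..., match=...)` does a regex search against the message, so it is sensitive to rewording, while checking only the exception type is not. The validator and its message below are hypothetical, standing in for the real config validation:

```python
# Sketch of the two assertion styles discussed above. The validator
# and its error message are illustrative placeholders.
import pytest

def load_config(params):
    # Hypothetical validation step.
    if "peft_method" not in params:
        raise ValueError("peft_method is required")
    return params

def test_missing_peft_method_message():
    # Style 1: match the message (brittle if the wording ever changes;
    # note that match= is a regex search, not an exact comparison).
    with pytest.raises(ValueError, match="peft_method is required"):
        load_config({})

def test_missing_peft_method_type():
    # Style 2: assert only the exception type. Rewording the error
    # message does not break this test; a comment documents the intent.
    with pytest.raises(ValueError):
        load_config({})
```

A middle ground is matching only a stable keyword from the message (e.g. `match="peft_method"`), which catches the right failure without pinning the full sentence.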
* Set up fixtures and data for tests
* Add basic unit tests
* Setting upper bound for transformers
* Ignore aim log files
* Include int num_train_epochs
* Fix formatting
* Add copyright notice
* Address review comments
* Run inference on tuned model
* Trainer downloads model
* add more unit tests and refactor
* Fix formatting
* Add FT unit test and refactor
* Removing transformers upper bound cap
* Address review comments

Signed-off-by: Thara Palanivel <130496890+tharapalanivel@users.noreply.github.com>
Signed-off-by: Anh-Uong <anh.uong@ibm.com>
Co-authored-by: Anh-Uong <anh.uong@ibm.com>
Description of the change
Adding unit tests for the `pt` and `lora` tuning methods using a dummy model, covering edge cases, invalid requests, etc. Continuation of PR #79.
Related issue number
Closes #74
How to verify the PR
Was the PR tested?