Conversation

@ocefpaf
Member

ocefpaf commented Jan 28, 2026

I'm not sure what is needed for the codecov GitHub Action to work b/c all the other projects where I implemented this already used it. Reading the docs now... Maybe nothing is needed.

Closes #98

@ocefpaf
Member Author

ocefpaf commented Jan 28, 2026

We will need a token for the coverage upload: https://github.com/codecov/codecov-action?tab=readme-ov-file#usage

@ChrisBarker-NOAA what do you want to do here? Just a plain table in the CI logs, or go for the codecov token? If the former, we can change the report from XML to a simple text table. If the latter, I don't believe I have admin rights here to do that.
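For reference, both output styles come from pytest-cov's `--cov-report` option (real pytest-cov flags; the task names below are hypothetical, written in the same style as this repo's task definitions):

```toml
# Illustrative task entries -- the --cov-report values are real pytest-cov
# options; the task names are made up for this sketch.
test_cov_table = "pytest --cov=xarray_subset_grid --cov-report=term-missing tests"
test_cov_xml = "pytest --cov=xarray_subset_grid --cov-report=xml tests"
```

`term-missing` prints a plain text table (with uncovered line numbers) straight into the CI logs, while `xml` writes a `coverage.xml` file of the kind the codecov uploader consumes.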

@ChrisBarker-NOAA
Contributor

I'm happy with a simple report in the CI.

codecov.io looks pretty cool though -- if you think it really adds value then we could do that. I think you can create a token as easily as I can. Your call.

@ocefpaf
Member Author

ocefpaf commented Feb 4, 2026

if you think it really adds value then we could do that.

I'm not a big fan of it for small projects. IMO, one less service is one less thing we need to worry about breaking, leaking secrets, etc. I do see some value for projects with 10+ developers though.

@ChrisBarker-NOAA
Contributor

Sounds good to me -- let's keep it simple.

In that case, does it make sense to have a separate coverage job, rather than simply running the tests with --cov and checking (or publishing somewhere) the coverage report?

@pmav99

pmav99 commented Feb 5, 2026

@ChrisBarker-NOAA the simplest solution is usually to just fail the tests if coverage drops below an acceptable percentage: https://stackoverflow.com/a/70511180/592289
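The approach in that answer boils down to pytest-cov's `--cov-fail-under` option; a minimal sketch (the 80% threshold is illustrative, not a value from this PR):

```toml
# pyproject.toml -- hypothetical sketch; --cov-fail-under is a real
# pytest-cov option, the 80 is an illustrative threshold.
[tool.pytest.ini_options]
addopts = "--cov=xarray_subset_grid --cov-fail-under=80"
```

With this in place, `pytest` exits non-zero whenever total coverage drops below the threshold, which fails the CI job without any extra service or job.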

@ChrisBarker-NOAA
Contributor

Well, yes, but at this point coverage is pathetic in this project -- so I want to see the report, and ideally get a warning, but not have it fail.

@ocefpaf
Member Author

ocefpaf commented Feb 5, 2026

The report will be shown in the CI, like https://github.com/ioos/xarray-subset-grid/actions/runs/21719434840/job/62644622607?pr=99#step:4:31

This one is ready for review. I had to pin pytest-cov to <7 b/c for some odd reason the latest one always hangs.

Contributor

ChrisBarker-NOAA left a comment


LGTM

lint = "ruff check tests xarray_subset_grid"
test = "pytest tests/"
test_all = "pytest --online tests/"
test_cov = "pytest --cov=xarray_subset_grid tests"

One note here -- there is an --online flag to run all the tests, including ones that need access to the internet to work (pulling remote data).

However, I'm not sure we should run that here -- those tests are really slow, and sometimes fail.

Better to increase coverage without the online tests.

So I think we should keep this as it is now.
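For context, a custom `--online` flag like this is usually wired up in `conftest.py`; below is a hypothetical sketch of the common pattern (the actual implementation in this repo may differ):

```python
# conftest.py -- hypothetical sketch of the usual pytest pattern for a
# "run the slow/networked tests" flag; not necessarily this repo's code.
import pytest


def pytest_addoption(parser):
    # Register the flag; by default, online tests are skipped.
    parser.addoption(
        "--online",
        action="store_true",
        default=False,
        help="also run tests that need network access",
    )


def pytest_collection_modifyitems(config, items):
    # Without --online, mark every test tagged @pytest.mark.online as skipped.
    if config.getoption("--online"):
        return
    skip_online = pytest.mark.skip(reason="needs --online to run")
    for item in items:
        if "online" in item.keywords:
            item.add_marker(skip_online)
```

Tests that pull remote data would then carry an `@pytest.mark.online` marker, and a plain `pytest tests/` run skips them unless `--online` is passed.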

@ChrisBarker-NOAA
Contributor

I don't see a reason not to merge this -- thanks!



Development

Successfully merging this pull request may close these issues.

Add coverage report to CI
