This repository was archived by the owner on Nov 21, 2025. It is now read-only.

ENH: Enable Moving Average Encoder in DeepMIL #872

Draft
kenza-bouzid wants to merge 14 commits into main from kenzab/moving_average_encoder

Conversation

@kenza-bouzid
Contributor

No description provided.

@codecov

codecov bot commented Apr 17, 2023

Codecov Report

Merging #872 (36e4f06) into main (da5547c) will increase coverage by 1.04%.
The diff coverage is 46.23%.

Impacted file tree graph

Flag Coverage Δ
hi-ml-cpath 75.16% <46.23%> (+2.04%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
hi-ml-cpath/src/health_cpath/utils/callbacks.py 95.50% <ø> (+4.49%) ⬆️
...i-ml-cpath/src/health_cpath/utils/deepmil_utils.py 75.00% <33.33%> (-1.41%) ⬇️
hi-ml-cpath/src/health_cpath/models/deepmil.py 81.81% <36.92%> (-12.10%) ⬇️
...src/health_cpath/configs/classification/BaseMIL.py 81.08% <41.66%> (-2.26%) ⬇️
hi-ml-cpath/src/SSL/encoders.py 100.00% <100.00%> (ø)
.../SSL/lightning_modules/byol/byol_moving_average.py 100.00% <100.00%> (ø)

... and 17 files with indirect coverage changes

def get_target_network(pl_module: pl.LightningModule) -> torch.nn.Module:
"""Return target network from pl_module.

:param pl_module: net containing target_network
Contributor


LightningModule containing ...
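Following the reviewer's suggestion, the docstring would describe `pl_module` as the LightningModule that owns the target network. A minimal sketch of what such a helper might look like (the attribute name `target_network` is taken from the docstring; the body is an assumption, not the PR's actual implementation):

```python
import torch


def get_target_network(pl_module) -> torch.nn.Module:
    """Return the target network from ``pl_module``.

    :param pl_module: LightningModule containing a ``target_network`` attribute.
    """
    # Assumes the BYOL-style module exposes its target (momentum) network
    # as a plain attribute.
    return pl_module.target_network
```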

)
ma_tau: float = param.Number(
default=0.99,
doc="Tau parameter for moving average encoder. Default is 0.99.",
Contributor


what does this tau stand for? momentum? Could be specified in the docs
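In BYOL-style moving-average encoders, tau is indeed the momentum coefficient: each update keeps a fraction `tau` of the old target weights and mixes in `1 - tau` of the online weights. A scalar sketch of one update step (the function name `ema_update` is illustrative, not from the PR):

```python
def ema_update(target: float, online: float, tau: float = 0.99) -> float:
    """One exponential-moving-average step.

    ``tau`` is the momentum: the fraction of the previous target weight
    retained; the remaining ``1 - tau`` comes from the online weight.
    """
    return tau * target + (1.0 - tau) * online
```

With `tau = 0.99` the target network changes slowly, which is the usual motivation for a high default momentum.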

"""Embed instances in chunks to avoid OOM errors.

:param instances: Tensor of shape N x C x H x W where N is the bag size
:param encoder: The encoder module
Contributor


why is encoder module passed here instead of using self.encoder?
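Passing the encoder explicitly lets the same chunking logic serve both the online encoder and the moving-average copy. The chunked-embedding idea in the quoted docstring can be sketched as follows (the helper name, the use of `Tensor.split`, and the signature are assumptions, not the PR's actual code):

```python
import torch
from torch import nn


def embed_in_chunks(instances: torch.Tensor, encoder: nn.Module,
                    chunk_size: int) -> torch.Tensor:
    """Embed a bag of instances in chunks to avoid OOM errors.

    :param instances: Tensor of shape N x ... where N is the bag size.
    :param encoder: Any encoder module; passed explicitly so the caller can
        supply either the online encoder or its moving-average copy.
    :param chunk_size: Maximum number of instances encoded per forward pass.
    """
    # Gradients are left intact here; wrap in torch.no_grad() if the
    # encoder is frozen.
    embeddings = [encoder(chunk) for chunk in instances.split(chunk_size)]
    return torch.cat(embeddings)
```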

self.ma_max_bag_size = ma_max_bag_size
self.apply_ma_inference = apply_ma_inference
self.ma_encoder = deepcopy(self.encoder)
set_module_gradients_enabled(self.ma_encoder, False)
Contributor


perhaps check if this is doing the right thing
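The reviewer's concern can be checked directly: after `deepcopy`, the moving-average encoder should share no parameters with the online encoder, and disabling gradients should flip `requires_grad` on the copy only. A small standalone check (using `requires_grad_(False)` in place of the repo's `set_module_gradients_enabled` helper, whose behavior is assumed equivalent):

```python
from copy import deepcopy

import torch

encoder = torch.nn.Linear(4, 2)
ma_encoder = deepcopy(encoder)

# Freeze the moving-average copy only.
for p in ma_encoder.parameters():
    p.requires_grad_(False)

# The copy is frozen, the original still trains, and no tensors are shared.
assert all(not p.requires_grad for p in ma_encoder.parameters())
assert all(p.requires_grad for p in encoder.parameters())
assert encoder.weight.data_ptr() != ma_encoder.weight.data_ptr()
```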
