
Reworked deep supervision #1740

Merged: 7 commits merged into MIC-DKFZ:master from deepsupervisionfix
Dec 5, 2023

Conversation

@TaWald (Member) commented Oct 11, 2023

Currently, deep supervision is half-static in the code and annoying to turn on/off.
I introduced self.enable_deep_supervision, which is now an attribute accessible in the class methods.

The deep_supervision flag itself is used at several positions (see the sketch after this list):

  1. def _build_loss, a class method that (currently) always wraps the loss with a deep supervision wrapper. Instead of wrapping unconditionally, it now checks the flag and only wraps when deep supervision is enabled.
  2. def build_network_architecture. Here the choice was hard-coded and could not be changed; to get a non-DS architecture one had to overload the entire initialize 💩 -- now it uses the class attribute.
  3. def _get_deep_supervision_scales. This used to always output the DS scales; now it conditionally outputs either the scales or None (which is relevant for (4)).
  4. def get_training_transforms (and the validation transforms). Here None has to be passed for the data augmentation not to return a list of labels at the wanted scales; originally one had to overload _get_deep_supervision_scales, which is unnecessary now.
  5. def on_train_start. This used a fixed value to set DDP to deep supervision; now it depends on the attribute.
  6. def validation_step. This used to always take the highest resolution (of the list); now it is conditional, so the validation step no longer has to be overloaded depending on whether one uses DS or not.
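
The common pattern behind all six items is a single boolean consulted everywhere deep supervision matters. Below is a minimal sketch of that pattern: the method names come from the list above, but the loss, the wrapper, and the scale values are simplified stand-ins, not the exact nnU-Net implementations.

```python
import numpy as np
import torch.nn as nn


class DeepSupervisionWrapper(nn.Module):
    """Applies the base loss at every output resolution and sums with weights.
    Simplified stand-in for the real deep supervision wrapper."""

    def __init__(self, loss: nn.Module, weights: np.ndarray):
        super().__init__()
        self.loss = loss
        self.weights = weights

    def forward(self, outputs, targets):
        return sum(w * self.loss(o, t)
                   for w, o, t in zip(self.weights, outputs, targets))


class TrainerSketch:
    def __init__(self, enable_deep_supervision: bool = True):
        # the one switch every DS-dependent method consults
        # (default True, so existing behavior is unchanged)
        self.enable_deep_supervision = enable_deep_supervision

    def _build_loss(self):
        base_loss = nn.CrossEntropyLoss()  # placeholder segmentation loss
        if self.enable_deep_supervision:  # item 1: wrap only when DS is on
            n = len(self._get_deep_supervision_scales())
            weights = np.array([1 / (2 ** i) for i in range(n)])
            weights /= weights.sum()
            return DeepSupervisionWrapper(base_loss, weights)
        return base_loss

    def _get_deep_supervision_scales(self):
        # item 3: scales when DS is on, None otherwise, so the data
        # augmentation (item 4) skips producing downscaled target labels
        if not self.enable_deep_supervision:
            return None
        return [[1.0] * 3, [0.5] * 3, [0.25] * 3]  # illustrative values

    def validation_step(self, output, target):
        # item 6: with DS on, the network returns one output per resolution;
        # keep only the highest-resolution pair for metric computation
        if self.enable_deep_supervision:
            output, target = output[0], target[0]
        # ... compute validation metrics on output/target as before
```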

Now the NoDeepSupervisionTrainer basically only sets deep_supervision to False (sketched below), which imo is way cleaner than before, but you can judge for yourself. All downstream trainers that changed the above-mentioned functions were updated to be compatible with the new structure. And since basically all changes are guarded by if self.enable_deep_supervision: [...], system behavior should be unaffected as long as the flag is left at its default value of True in __init__.
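
For illustration, the variant trainer then reduces to flipping the flag and inheriting everything else. A sketch against the TrainerSketch class above; the real trainer's constructor signature differs.

```python
class NoDeepSupervisionTrainerSketch(TrainerSketch):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # the only deviation from the base trainer: turn DS off
        self.enable_deep_supervision = False
```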

(P.S. Black formatted the files automatically when I changed them, so the diff looks larger than the actual change.)

Commits

Verifies that the npys are readable and, if not, tries to re-extract them from the .npz files. Otherwise a faulty extraction can lead to errors once the broken case is sampled by the dataloading (illustrated below).
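
A small illustration of the check this commit describes, assuming one .npy per case with a sibling .npz archive and an array key of "data"; the file layout and key name are assumptions, not the exact nnU-Net preprocessing format.

```python
import numpy as np


def verify_or_reextract(npy_path: str, npz_path: str, key: str = "data") -> None:
    """Load the .npy to prove it is readable; rebuild it from the .npz if not."""
    try:
        # mmap_mode avoids reading the whole array just to validate the file
        np.load(npy_path, mmap_mode="r")
    except (OSError, ValueError, EOFError):
        # faulty extraction: re-extract from the compressed archive so the
        # dataloader never samples a broken case mid-training
        np.save(npy_path, np.load(npz_path)[key])
```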
Previously, TrainerNoDeepSupervision was broken; now it works and is easier to control via inheritance.
@TaWald added the enhancement (New feature or request) label on Oct 11, 2023
@FabianIsensee merged commit dcf9964 into MIC-DKFZ:master on Dec 5, 2023. 1 check passed.
@FabianIsensee (Member) commented:
Thanks, nice work!
