This repository was archived by the owner on Apr 17, 2023. It is now read-only.

[OTX/MPA/Semi-SL-Seg] Enable training and evaluation for semi-supervised learning in segmentation & Refactor stage.py to separate task configs #103

Merged
supersoob merged 33 commits into otx from semi_seg_otx
Dec 20, 2022

Conversation

@supersoob

@supersoob supersoob commented Dec 8, 2022

This PR includes

  • New segmentor -> Mean Teacher for semi-sl
  • New trainer/inferrer/stage for semi-sl
  • Refactor stage.py for each task
  • Custom pipeline methods for unlabeled data augmentation
  • Revisit is_task_adapt flag
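
For readers unfamiliar with the Mean Teacher segmentor mentioned above: the core idea is that a "teacher" copy of the model is kept as an exponential moving average (EMA) of the "student" weights, and pseudo-labels from the teacher supervise the student on unlabeled data. The sketch below shows only the EMA update with illustrative names (a plain dict of floats, not the PR's actual mmseg-based implementation):

```python
import copy

class MeanTeacher:
    """Minimal EMA bookkeeping sketch; `student`/`teacher` here are plain
    parameter dicts, not torch modules (hypothetical, for illustration)."""

    def __init__(self, student_params, ema_decay=0.999):
        self.ema_decay = ema_decay
        self.student = dict(student_params)
        # The teacher starts as an exact copy of the student.
        self.teacher = copy.deepcopy(self.student)

    def update_teacher(self):
        # teacher = decay * teacher + (1 - decay) * student
        for name, w in self.student.items():
            self.teacher[name] = (
                self.ema_decay * self.teacher[name]
                + (1.0 - self.ema_decay) * w
            )

student = {"w": 1.0}
mt = MeanTeacher(student, ema_decay=0.9)
mt.student["w"] = 2.0  # pretend one optimizer step moved the student
mt.update_teacher()
print(mt.teacher["w"])  # ≈ 1.1, i.e. 0.9 * 1.0 + 0.1 * 2.0
```

Because the teacher lags the student, its predictions on unlabeled images are smoother, which is why they are used as pseudo-label targets.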

TODOs

  • Refactor v3 - pre-commit/rename
  • Import issue in unlabeled data hook
  • Check test codes and results

@supersoob supersoob marked this pull request as ready for review December 9, 2022 07:57
Contributor

@kprokofi kprokofi left a comment


first bunch of comments

for (k, v) in loss_decode_u.items():
    if v is None:
        continue
    losses[k] = loss_decode[k] + loss_decode_u[k] * self.unsup_weight
Contributor


losses = {k: loss_decode[k] + v * self.unsup_weight for k, v in loss_decode_u.items() if v is not None}
BTW, is loss a dict because we can have several losses?
Maybe we can add unlabeled data to _decode_head_forward_train and modify this function?
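
To make the two styles discussed above concrete, here is a self-contained comparison with made-up loss values (`unsup_weight` and both loss dicts are toy numbers, not from the PR). Note the comprehension variant keeps the supervised-only value for keys whose unsupervised loss is None, matching what the loop does when `losses` already holds the supervised losses:

```python
unsup_weight = 0.5
loss_decode = {"loss_ce": 1.0, "loss_aux": 0.4}    # supervised losses (toy)
loss_decode_u = {"loss_ce": 0.6, "loss_aux": None}  # unsupervised losses (toy)

# Loop form, as in the PR: start from the supervised losses and
# add the weighted unsupervised term where it exists.
losses_loop = dict(loss_decode)
for k, v in loss_decode_u.items():
    if v is None:
        continue
    losses_loop[k] = loss_decode[k] + v * unsup_weight

# Equivalent dict comprehension.
losses_comp = {
    k: loss_decode[k] + v * unsup_weight if v is not None else loss_decode[k]
    for k, v in loss_decode_u.items()
}

print(losses_loop)  # {'loss_ce': 1.3, 'loss_aux': 0.4}
```

Both forms produce the same dict; the choice is purely stylistic, which is the point the thread settles on.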

Author


I think this implementation is more intuitive than a dict comprehension.

Contributor

@kprokofi kprokofi left a comment


Second bunch of comments

from mpa.utils.data_cpu import MMDataCPU


def set_random_seed(seed, deterministic=False):
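
For context on the snippet quoted above: a `set_random_seed` helper in mm-style training code typically seeds every RNG in play. The stripped-down sketch below uses only the stdlib so it stays runnable; the numpy/torch lines, which the real MPA version would need, are shown as comments and are an assumption, not the actual MPA implementation:

```python
import random

def set_random_seed(seed, deterministic=False):
    """Seed the RNGs used during training (stdlib-only sketch)."""
    random.seed(seed)
    # A full version would roughly also do:
    #   np.random.seed(seed)
    #   torch.manual_seed(seed)
    #   if deterministic:
    #       torch.backends.cudnn.deterministic = True
    #       torch.backends.cudnn.benchmark = False

set_random_seed(42)
a = random.random()
set_random_seed(42)
b = random.random()
print(a == b)  # True: the same seed reproduces the draw
```

The review comment below argues this duplicate helper should be deleted because the config-driven seeding in Stage.py already covers it.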
Contributor


(I know it is not your code, but it is something I noticed.) This function isn't used anywhere; moreover, we already have set_random_seed in Stage.py based on the config. Let's delete it.

Author

@supersoob supersoob Dec 19, 2022


Can I refactor the code inside stage.py in the next PR? The scope of this PR is separating the stage.py config for semi-SL.

@supersoob
Author

supersoob commented Dec 19, 2022

@jaegukhyun @kprokofi @JihwanEom I revised the code based on your reviews; please take another look. I resolved your comments. If you need to add comments or still have questions about a resolved thread, unresolve it or add a new comment.

@supersoob supersoob merged commit d9c4cc9 into otx Dec 20, 2022
@supersoob supersoob deleted the semi_seg_otx branch December 20, 2022 00:56
