Closed
Labels
Status: Stale 🍞 Been open for a while with no activity · Type: New Feature ➕ Introduction of a completely new addition to the codebase
Description
Is your feature request related to a problem? Please describe.
The current version of federated_client.py optimizes all parameters of the model:
self.optimizer = optimizer(model.parameters(), **optimizer_args)
For transfer learning with feature extraction, only the classifier needs to be optimized.
Describe the solution you'd like
train_config.py already accepts optimizer_args, but it cannot specify the params argument of the optimizer: if you include it, you get TypeError: __init__() got multiple values for argument 'params', because model.parameters() is already passed positionally by default.
This could be changed to use params from optimizer_args when it is present, and otherwise fall back to the current behavior, i.e.:
optimizer = getattr(th.optim, optimizer_name)
optimizer_args.setdefault("params", model.parameters())
self.optimizer = optimizer(**optimizer_args)
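To illustrate how the proposed setdefault line resolves params, here is a minimal, self-contained sketch. The FakeSGD class and build_optimizer helper are hypothetical stand-ins (a real torch optimizer such as th.optim.SGD has the same (params, **kwargs) signature); only the setdefault-based resolution mirrors the proposal:

```python
class FakeSGD:
    # Hypothetical stand-in mimicking a torch optimizer's
    # __init__(params, lr=...) signature, used only for illustration.
    def __init__(self, params, lr=0.01):
        self.params = list(params)
        self.lr = lr


def build_optimizer(model_params, optimizer_cls, optimizer_args):
    # Proposed behavior: honor an explicit "params" entry in
    # optimizer_args; otherwise fall back to all model parameters
    # (the current behavior).
    optimizer_args = dict(optimizer_args)  # avoid mutating the caller's dict
    optimizer_args.setdefault("params", model_params)
    return optimizer_cls(**optimizer_args)


all_params = ["backbone.weight", "classifier.weight"]  # placeholder parameter names
classifier_only = ["classifier.weight"]

# Default: every parameter is optimized, as today.
opt = build_optimizer(all_params, FakeSGD, {"lr": 0.1})
print(opt.params)  # ['backbone.weight', 'classifier.weight']

# Feature extraction: the caller supplies params explicitly,
# so only the classifier is optimized.
opt = build_optimizer(all_params, FakeSGD, {"lr": 0.1, "params": classifier_only})
print(opt.params)  # ['classifier.weight']
```

In the real codebase the same pattern would let a transfer-learning config pass something like model.classifier.parameters() through optimizer_args without touching the default path.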
Describe alternatives you've considered
None
Additional context
None