Support for transfer learning with feature extraction #3116

@gconst02

Description

Is your feature request related to a problem? Please describe.
The current version of federated_client.py optimizes all of the model's parameters:

self.optimizer = optimizer(model.parameters(), **optimizer_args)

For transfer learning with feature extraction, only the classifier's parameters need to be optimized.
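To illustrate the pattern the request is about, here is a minimal sketch of feature extraction in plain PyTorch (the toy model and layer names are illustrative, not from federated_client.py): the feature extractor is frozen and only the classifier head is passed to the optimizer.

```python
import torch as th
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(4, 8),   # stand-in for a pretrained feature extractor
    nn.ReLU(),
    nn.Linear(8, 2),   # classifier head to be fine-tuned
)

# Freeze everything except the final (classifier) layer.
for param in model[:-1].parameters():
    param.requires_grad = False

# Optimize only the classifier's parameters.
optimizer = th.optim.SGD(model[-1].parameters(), lr=0.1)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(trainable, total)  # → 18 58
```

This is exactly what cannot currently be expressed through federated_client.py, since the optimizer is always constructed with `model.parameters()`.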

Describe the solution you'd like
train_config.py already accepts optimizer_args, but it cannot specify the params argument of the optimizer: if you pass it, you get TypeError: __init__() got multiple values for argument 'params', because model.parameters() is already passed positionally.

This can be modified to use params from optimizer_args when it is present, falling back to the current behavior otherwise, i.e.:

optimizer = getattr(th.optim, optimizer_name)
# Fall back to all model parameters only if "params" was not supplied.
optimizer_args.setdefault("params", model.parameters())
self.optimizer = optimizer(**optimizer_args)

Describe alternatives you've considered
None

Additional context
None

Metadata

Assignees

No one assigned

    Labels

    Status: Stale 🍞 Been open for a while with no activity
    Type: New Feature ➕ Introduction of a completely new addition to the codebase

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests