FLOptimizer #3141

@iamtrask

Description

Is your feature request related to a problem? Please describe.
Some optimizers (such as Adam) do not work with Federated Learning by default because they maintain a cache of the N most recent gradients. For example, take this demo (https://github.com/OpenMined/PySyft/blob/master/examples/tutorials/Part%2002%20-%20Intro%20to%20Federated%20Learning.ipynb), replace the optimizer with Adam, and you'll get a bug.

The solution is to have a separate optimizer for each worker in the FL process, and to use each one at the appropriate time.

Describe the solution you'd like

Obviously this would be really annoying to have to write by hand every time... so please create an FLOptimizer which will, under the hood, maintain a list of optimizer objects (one for each worker) and use them in the appropriate context.
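A minimal sketch of what this could look like, assuming nothing about PySyft's actual API: the class name `FLOptimizer`, the `optimizer_fn` factory, and the `worker_id` keying are all hypothetical choices for illustration. The key idea is that each worker lazily gets its own optimizer instance, so stateful optimizers like Adam keep their moment caches separated per worker. The demo uses a stand-in dummy optimizer so the sketch runs without torch installed; in practice `optimizer_fn` would be something like `lambda params: torch.optim.Adam(params)`.

```python
class FLOptimizer:
    """Hypothetical sketch: maintains one optimizer instance per worker.

    optimizer_fn is a factory taking a parameter list and returning a
    fresh optimizer, so each worker's optimizer state stays isolated.
    """

    def __init__(self, optimizer_fn):
        self.optimizer_fn = optimizer_fn
        self._optimizers = {}  # worker id -> that worker's optimizer

    def get(self, worker_id, params):
        # Lazily create an optimizer the first time a worker is seen.
        if worker_id not in self._optimizers:
            self._optimizers[worker_id] = self.optimizer_fn(params)
        return self._optimizers[worker_id]

    def zero_grad(self, worker_id, params):
        self.get(worker_id, params).zero_grad()

    def step(self, worker_id, params):
        # Dispatch to the optimizer belonging to the active worker.
        self.get(worker_id, params).step()


# Stand-in optimizer so the sketch runs without torch; it just counts steps.
class DummyOptimizer:
    def __init__(self, params):
        self.steps = 0

    def zero_grad(self):
        pass

    def step(self):
        self.steps += 1


fl = FLOptimizer(DummyOptimizer)
fl.step("alice", params=None)
fl.step("alice", params=None)
fl.step("bob", params=None)
```

After this, `"alice"` and `"bob"` each hold distinct optimizer instances with independent state, which is exactly the property Adam needs in the federated loop.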

Metadata

Labels

    Good first issue 🎓 — Perfect for beginners, welcome to OpenMined!
    Type: New Feature ➕ — Introduction of a completely new addition to the codebase
