
added floptimizer to maintain a list of optimizer objects #3179

Merged
iamtrask merged 17 commits into OpenMined:master from rimijoker:master
Mar 14, 2020
Conversation

@rimijoker
Copy link
Member

FLOptimizer which will, under the hood, maintain a list of optimizer objects #3141
Made a FlOptimizer.py that maintains a list of optimizer objects, one for each worker, and uses the appropriate one in each context.
Changed tutorial 2 to demonstrate the same.
Fixes #3141
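As a rough sketch of the idea (plain Python, no PySyft dependency; `FakeOptimizer` and the worker ids here are made up for illustration), the per-worker bookkeeping boils down to a dict keyed by worker, with optimizers created lazily on first use:

```python
# Hypothetical stand-in for a torch.optim class, for illustration only.
class FakeOptimizer:
    def __init__(self, params, lr=0.1):
        self.params = params
        self.lr = lr

class FLOptimizer:
    """Keeps one optimizer per worker, created on first use."""
    def __init__(self, optimizer_class, lr=0.1):
        self.optimizer_class = optimizer_class
        self.lr = lr
        self.opt_dict = {}

    def get_optimizer(self, worker, params):
        # setdefault returns the existing optimizer for a known worker
        # and registers a fresh one the first time a worker appears
        return self.opt_dict.setdefault(
            worker, self.optimizer_class(params, lr=self.lr))

fl_opt = FLOptimizer(FakeOptimizer, lr=0.05)
opt_bob = fl_opt.get_optimizer("bob", params=[1, 2])
opt_alice = fl_opt.get_optimizer("alice", params=[3, 4])
assert fl_opt.get_optimizer("bob", params=[1, 2]) is opt_bob  # reused, not recreated
assert opt_bob is not opt_alice  # separate state per worker
```

The same pattern appears in the snippet discussed below, where the dict key is the remote model's `.location`.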

@review-notebook-app
Copy link

Check out this pull request on ReviewNB

You'll be able to see Jupyter notebook diff and discuss changes. Powered by ReviewNB.

@rimijoker rimijoker changed the title added FLOptimizer to maintain a list of optimizer objects added FLOptimizer to maintain a list of optimizer objects #3141 Mar 11, 2020
@rimijoker rimijoker changed the title added FLOptimizer to maintain a list of optimizer objects #3141 added FLOptimizer to maintain a list of optimizer objects Mar 11, 2020
@rimijoker rimijoker changed the title added FLOptimizer to maintain a list of optimizer objects [WIP]added FLOptimizer to maintain a list of optimizer objects Mar 11, 2020
@rimijoker rimijoker changed the title [WIP]added FLOptimizer to maintain a list of optimizer objects added floptimizer to maintain a list of optimizer objects Mar 11, 2020
@ratmcu
Copy link
Contributor

ratmcu commented Mar 11, 2020

I developed this class for the FLOptimizer.
It worked on the example in the tutorial, but I couldn't elegantly get around passing an arbitrary number of arguments; maybe it would be good to inherit from this class and create a new subclass for each optimizer. I hope we can keep that, along with smooth passing of the cached averages between workers, as TODOs.

from torch import nn, optim

model = nn.Linear(2, 1)

class FLOptimizer():
    # def __init__(self, optimizer_class, **kwargs):
    def __init__(self, optimizer_class, lr=0.1):
        self.optimizer_class = optimizer_class
        self.opt_dict = {}
        # self.kwargs = kwargs
        self.lr = lr

    def get_optimizer(self, model):
        # remote model: one optimizer per worker, keyed by its location
        if hasattr(model, 'location'):
            opt = self.opt_dict.setdefault(
                model.location, self.optimizer_class(model.parameters(), lr=self.lr))
            return opt
        # local model: fall back to a single 'central' optimizer
        opt = self.opt_dict.setdefault(
            'central', self.optimizer_class(model.parameters(), lr=self.lr))
        return opt

def train():
    # Training logic (`datasets` comes from the tutorial setup)
    fl_opt = FLOptimizer(optim.Adam, lr=0.1)
    for epoch in range(10):
        # NEW) iterate through each worker's dataset
        for data, target in datasets:

            # NEW) send model to correct worker
            model.send(data.location)
            # replaces the per-worker if/else:
            # if data.location == bob:
            #     opt = opt_bob
            # if data.location == alice:
            #     opt = opt_alice
            opt = fl_opt.get_optimizer(model)
            # 1) erase previous gradients (if they exist)
            opt.zero_grad()
            # 2) make a prediction
            pred = model(data)
            # 3) calculate how much we missed
            loss = ((pred - target) ** 2).sum()
            # 4) figure out which weights caused us to miss
            loss.backward()
            # 5) change those weights
            opt.step()

            # NEW) get model (with gradients)
            model.get()
            # opt.set_new_params(model.parameters())
            # 6) print our progress
            print(loss.get())  # NEW) slight edit... need to call .get() on loss

# federated averaging
train()

Also, where could a class like this go in PySyft?
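On the arbitrary-arguments point above: one possible way (a sketch, not tested against PySyft; `DummyOpt` below is a hypothetical stand-in for a `torch.optim` class) is to capture `**kwargs` at construction and forward them unchanged, which is essentially the commented-out lines in the snippet:

```python
# Hypothetical optimizer class standing in for torch.optim.SGD etc.
class DummyOpt:
    def __init__(self, params, lr=0.1, momentum=0.0):
        self.params = params
        self.lr = lr
        self.momentum = momentum

class FLOptimizer:
    def __init__(self, optimizer_class, **kwargs):
        self.optimizer_class = optimizer_class
        self.kwargs = kwargs   # e.g. lr, momentum, betas, ...
        self.opt_dict = {}

    def get_optimizer(self, worker, params):
        # forward the stored kwargs unchanged to any optimizer class,
        # so no per-optimizer subclass is needed just for its arguments
        return self.opt_dict.setdefault(
            worker, self.optimizer_class(params, **self.kwargs))

fl_opt = FLOptimizer(DummyOpt, lr=0.01, momentum=0.9)
opt = fl_opt.get_optimizer("bob", params=[0.0])
assert opt.lr == 0.01 and opt.momentum == 0.9
```

This keeps a single generic class; subclassing per optimizer would only be needed if some optimizers require per-worker argument differences.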

@rimijoker rimijoker changed the title added floptimizer to maintain a list of optimizer objects [WIP]added floptimizer to maintain a list of optimizer objects Mar 12, 2020
@rimijoker
Copy link
Member Author

@ratmcu In the federated dir maybe :-)

@rimijoker rimijoker changed the title [WIP]added floptimizer to maintain a list of optimizer objects added floptimizer to maintain a list of optimizer objects Mar 12, 2020
Development

Successfully merging this pull request may close these issues.

FLOptimizer
