
model.get() causes RuntimeError if the code is running on GPU with resnet model #4568

@Hjeljeli

Description


After training the model remotely on a worker, I call model.get() to retrieve it and hit the following runtime error: "Expected object of device type cuda but got device type cpu for argument #1 'self' in call to th_set".
I am training on GPU (the same code runs fine on CPU) and I am using a resnet50 model.

How to Reproduce

import torch
import torch.nn as nn
import torch.optim as optim

# model, worker, batches, device, and lr are defined earlier
optimizer = optim.Adam(model.parameters(), lr=lr)
criterion = nn.CrossEntropyLoss()

model.train()
model.send(worker)  # send the model to the remote worker
for batch_idx, (data, target) in enumerate(batches):
    data, target = data.to(device), target.to(device)
    optimizer.zero_grad()
    output = model(data)
    loss = criterion(output, target)
    loss.backward()
    optimizer.step()
    loss = loss.get()
    model.get()  # <-- this get() raises the error
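The error message points at an in-place set_ between a CUDA parameter and a CPU tensor during model.get(). Below is a minimal sketch of one possible workaround in plain PyTorch, with the PySyft calls left as comments (the send()/get() round trip needs a running hook and worker): keep the model on CPU for the send()/get() round trip and move it to the GPU only afterwards. nn.Linear stands in for resnet50 here, and whether this sidesteps the 0.2.x bug is an assumption, not a confirmed fix.

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(4, 2)  # hypothetical stand-in for resnet50
# model.send(worker)     # PySyft: train remotely while the model is on CPU
# ...training loop...
# model.get()            # PySyft: retrieval writes CPU tensors in place

# Moving to the GPU only after get() avoids mixing CUDA parameters
# with the CPU tensors written in place during retrieval:
model = model.to(device)
out = model(torch.randn(1, 4, device=device))
print(tuple(out.shape))
```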

System Information

  • syft: 0.2.9


    Labels

      • 0.2.x — Relating to the 0.2.x code branch
      • Type: Bug 🐛 — Some functionality not working in the codebase as intended
      • hacktoberfest
