
Chebyshev or Sigmoid for tanh #3113

Merged
LaRiffle merged 6 commits into OpenMined:master from gmuraru:gm-improve-tanh
Mar 9, 2020
Conversation

@gmuraru (Member) commented Feb 27, 2020:

Added the reference for the paper where the polynomial approximations are discussed.
Here is the paper (it is also added as a comment)

They talk about more functions and how to approximate them - maybe it is worth having approximations for more functions?

Added tests and the possibility to choose between Chebyshev or Sigmoid for computing the hyperbolic tangent.
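The two approximation strategies can be sketched in plain NumPy (illustrative only — PySyft's actual MPC implementation evaluates these on secret shares, and the degree, interval, and function names here are assumptions, not the PR's exact choices). The sigmoid route uses the identity tanh(x) = 2·sigmoid(2x) − 1, so any secure sigmoid approximation yields tanh for free; the Chebyshev route fits a polynomial, which an MPC backend can evaluate with only additions and multiplications:

```python
import numpy as np

def tanh_via_sigmoid(x):
    # tanh(x) = 2 * sigmoid(2x) - 1: an exact identity, so a secure
    # sigmoid approximation immediately gives a tanh approximation.
    return 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0

def tanh_via_chebyshev(x, degree=9, interval=4.0):
    # Least-squares Chebyshev fit to tanh on [-interval, interval];
    # only the fitted polynomial needs to be evaluated on shares.
    grid = np.linspace(-interval, interval, 200)
    coeffs = np.polynomial.chebyshev.chebfit(grid, np.tanh(grid), degree)
    return np.polynomial.chebyshev.chebval(x, coeffs)

x = np.linspace(-3.0, 3.0, 13)
sig_err = np.max(np.abs(tanh_via_sigmoid(x) - np.tanh(x)))
cheb_err = np.max(np.abs(tanh_via_chebyshev(x) - np.tanh(x)))
```

The trade-off the PR exposes: the sigmoid identity inherits whatever error the secure sigmoid has, while the Chebyshev polynomial's error is controlled directly by its degree and fitting interval.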

@gmuraru changed the title from "[WIP] Cebashev update aprox for tanh" to "Chebyshev or Sigmoid for tanh" Mar 5, 2020
return sy.MultiPointerTensor(children=pointers)

def zero(self):
def zero(self, shape=None):
gmuraru (Member Author):

I needed this for the matrix multiplication case - there, the shape of the zero shares can differ from the shape of the operands.
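The shape-aware zero-share helper can be sketched like this (a minimal NumPy model, not PySyft's actual implementation; `FIELD`, the function name, and the default shape are illustrative assumptions):

```python
import numpy as np

FIELD = 2 ** 31  # illustrative ring size, not PySyft's actual field

def zero_shares(n_workers, shape=None, field=FIELD):
    # Produce n additive shares that sum to 0 mod field. The optional
    # shape matters after a matmul, where the masked result (and hence
    # the zero shares) has a different shape than the operands.
    shape = shape if shape is not None else (1,)
    shares = [
        np.random.randint(0, field, size=shape, dtype=np.int64)
        for _ in range(n_workers - 1)
    ]
    shares.append((-sum(shares)) % field)  # last share cancels the rest
    return shares

masks = zero_shares(3, shape=(2, 4))  # e.g. the shape of a matmul result
```

Adding one such share per worker re-randomizes the result without changing the reconstructed value.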

alice, bob, james = workers["alice"], workers["bob"], workers["james"]

fix_prec_tolerance_by_method = {
"chebyshev": {3: 3 / 100, 4: 3 / 100, 5: 3 / 100},
gmuraru (Member Author):

These might need more tweaking - it currently works, but we might want a lower bound.

LaRiffle (Contributor):

That's ok I guess!

@LaRiffle (Contributor) left a comment:

Small things around the tensor chain and the use of extra wrappers, and then we're good :)

crypto_provider=self.crypto_provider,
**no_wrap,
).child
elif isinstance(other, AdditiveSharingTensor):
LaRiffle (Contributor):

This is not supposed to happen: the decorator makes sure that the AST other is replaced by its .child.

So if you still have an AST here, it means you've operated on something like [syfttype]>AST>... with the self AST.

What is the use case for this?


for worker, share in it:
cmd_res = cmd(share, other)
res[worker] = (cmd(share, other) + zero_shares[worker]) % self.field
LaRiffle (Contributor):

Can you put everything in the same loop (and add a flag to detect the first iteration)?

gmuraru (Member Author):
Sure
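The suggested merged loop might look like this (a self-contained sketch with illustrative names — `apply_and_mask`, `FIELD`, and the lazy mask generation are assumptions, not PySyft's actual code):

```python
import numpy as np

FIELD = 2 ** 31  # illustrative ring size

def apply_and_mask(cmd, shares, other, field=FIELD):
    # One pass over the shares; a first-iteration check builds the
    # zero shares lazily, once the result shape is known from cmd_res.
    res, zero_shares = {}, None
    for worker, share in shares.items():
        cmd_res = cmd(share, other)
        if zero_shares is None:  # first iteration: shape is now known
            masks = [
                np.random.randint(0, field, cmd_res.shape, dtype=np.int64)
                for _ in range(len(shares) - 1)
            ]
            masks.append((-sum(masks)) % field)
            zero_shares = dict(zip(shares, masks))
        res[worker] = (cmd_res + zero_shares[worker]) % field
    return res

shares = {"alice": np.array([1, 2]), "bob": np.array([3, 4])}
out = apply_and_mask(lambda s, o: s * o, shares, np.array([5, 5]))
```

This avoids the double `cmd(share, other)` call from the original snippet while keeping a single loop.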


if isinstance(self.child, AdditiveSharingTensor) and isinstance(other.child, torch.Tensor):
# If we try to matmul a FPT>(wrap)>AST with a FPT>torch.tensor,
# If we try to matmul a FPT>(wrap)>AST with a FPT>(wrap)>torch.tensor,
LaRiffle (Contributor):

Nope, FPT>(wrap)>torch.tensor is forbidden!

But actually FPT>(wrap)>AST is not correct anymore; now we do FPT>AST directly.

self.child, torch.Tensor
):
# If we try to matmul a FPT>torch.tensor with a FPT>(wrap)>AST,
# If we try to matmul a FPT>(wrap)>torch.tensor with a FPT>(wrap)>AST,
LaRiffle (Contributor):

same here

alice, bob, james = workers["alice"], workers["bob"], workers["james"]

fix_prec_tolerance_by_method = {
"chebyshev": {3: 3 / 100, 4: 3 / 100, 5: 3 / 100},
LaRiffle (Contributor):

That's ok I guess!

elif isinstance(other.child, AdditiveSharingTensor) and isinstance(
self.child, torch.Tensor
):
# If we try to matmul a FPT>torch.tensor with a FPT>(wrap)>AST,
gmuraru (Member Author):

@LaRiffle removed the wrap

LaRiffle (Contributor):

Great!

@gmuraru gmuraru requested a review from LaRiffle March 8, 2020 18:00
@LaRiffle (Contributor) left a comment:

All good!

@LaRiffle LaRiffle merged commit 3090449 into OpenMined:master Mar 9, 2020