Chebyshev or Sigmoid for tanh #3113
Conversation
  return sy.MultiPointerTensor(children=pointers)

- def zero(self):
+ def zero(self, shape=None):
Needed this in case of a matrix multiplication: in that case the dimensions of the zero shares might be different.
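To make the point concrete: after a matmul, the re-randomizing zero shares must match the result's shape, not the input shares'. A minimal stand-alone sketch (the helper name `zero_shares` and the field value are assumptions for illustration, not PySyft's API):

```python
import numpy as np

FIELD = 2 ** 31 - 1  # assumed modulus for the illustration

def zero_shares(n_workers, shape=None, field=FIELD):
    """Generate n_workers additive shares that sum to zero mod field.

    `shape` must be passed explicitly when it cannot be inferred from
    the inputs, e.g. when re-randomizing the result of a matmul whose
    shape differs from the input shares'.
    """
    shape = shape if shape is not None else (1,)
    shares = [np.random.randint(0, field, size=shape, dtype=np.int64)
              for _ in range(n_workers - 1)]
    shares.append((-sum(shares)) % field)  # last share closes the sum to zero
    return shares
```

Reconstructing (summing) the shares modulo the field always yields an all-zero tensor of the requested shape.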
  alice, bob, james = workers["alice"], workers["bob"], workers["james"]

  fix_prec_tolerance_by_method = {
+     "chebyshev": {3: 3 / 100, 4: 3 / 100, 5: 3 / 100},
These might need more tweaking: it currently works, but we might want a lower bound.
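For context, here is how such a per-method, per-precision tolerance table might be exercised in a test. The evaluator below is only a stand-in (tanh via the sigmoid identity, with decimal rounding imitating fixed precision); the table values are the ones from the diff, everything else is assumed:

```python
import math

# Tolerances keyed by method, then by precision_fractional (values from the diff).
fix_prec_tolerance_by_method = {
    "chebyshev": {3: 3 / 100, 4: 3 / 100, 5: 3 / 100},
}

def fix_prec(value, prec_frac):
    """Round to prec_frac fractional decimal digits, mimicking fixed precision."""
    scale = 10 ** prec_frac
    return round(value * scale) / scale

def tanh_fixed(x, prec_frac):
    """tanh via the identity tanh(x) = 2*sigmoid(2x) - 1, rounding after
    each step (illustrative stand-in for the approximation under test)."""
    sig = fix_prec(1.0 / (1.0 + math.exp(-2 * x)), prec_frac)
    return fix_prec(2 * sig - 1, prec_frac)
```

A test would then assert that, for each supported `precision_fractional`, the approximation stays within the tabulated tolerance of `math.tanh` on a grid of inputs.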
LaRiffle left a comment
Small things around the tensor chain and the use of extra wrappers, and then we're good :)
      crypto_provider=self.crypto_provider,
      **no_wrap,
  ).child
+ elif isinstance(other, AdditiveSharingTensor):
This is not supposed to happen: the decorator makes sure that the AST `other` is replaced by its `.child`.
So if you still have an AST here, it means you've operated on something like [syfttype]>AST>... with the self AST.
What is the use case for this?
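The invariant the reviewer describes can be sketched as a decorator: a wrapped argument is replaced by its `.child` before the share-level method runs, so the body should never see a wrapper as `other`. A toy illustration (not PySyft's actual hook, all names assumed):

```python
import functools

class Wrapped:
    """Toy stand-in for a tensor wrapper holding a .child attribute."""
    def __init__(self, child):
        self.child = child

def unwrap_other(method):
    """Replace a wrapped `other` by its .child before dispatching."""
    @functools.wraps(method)
    def wrapper(self, other):
        while isinstance(other, Wrapped):
            other = other.child
        return method(self, other)
    return wrapper

class Share:
    def __init__(self, value):
        self.value = value

    @unwrap_other
    def add(self, other):
        # `other` is guaranteed to be a plain value by the decorator
        return Share(self.value + other)
```

With this in place, an `isinstance(other, Wrapped)` branch inside `add` would be dead code, which is the reviewer's point about the AST branch.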
  for worker, share in it:
-     cmd_res = cmd(share, other)
+     res[worker] = (cmd(share, other) + zero_shares[worker]) % self.field
Can you put everything in the same loop (and add a flag to detect the first iteration)?
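That suggestion could look like the following single-pass sketch, where the zero shares are generated on the first iteration, once the result shape is known (all names and the modulus are assumptions, not the PR's code):

```python
import numpy as np

FIELD = 2 ** 31 - 1  # assumed modulus for the illustration

def zero_shares(n, shape, field=FIELD):
    """n additive shares summing to zero mod field, of the given shape."""
    shares = [np.random.randint(0, field, size=shape, dtype=np.int64)
              for _ in range(n - 1)]
    shares.append((-sum(shares)) % field)
    return shares

def op_and_rerandomize(shares, other, cmd, field=FIELD):
    """Apply cmd to every share and add a fresh zero share, in one loop.

    The first-iteration flag (here: i == 0) lets us defer creating the
    zero shares until the result shape is known, which matters for ops
    like matmul that change the shape.
    """
    res = {}
    zshares = None
    for i, (worker, share) in enumerate(shares.items()):
        value = cmd(share, other) % field
        if i == 0:  # first iteration: result shape is now known
            zshares = zero_shares(len(shares), value.shape, field)
        res[worker] = (value + zshares[i]) % field
    return res
```

Reconstructing the per-worker results modulo the field gives the same value as applying the operation in the clear, since the injected zero shares cancel.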
  if isinstance(self.child, AdditiveSharingTensor) and isinstance(other.child, torch.Tensor):
-     # If we try to matmul a FPT>(wrap)>AST with a FPT>torch.tensor,
+     # If we try to matmul a FPT>(wrap)>AST with a FPT>(wrap)>torch.tensor,
Nope, FPT>(wrap)>torch.tensor is forbidden!
And actually FPT>(wrap)>AST is no longer correct either: we now do FPT>AST directly.
      self.child, torch.Tensor
  ):
-     # If we try to matmul a FPT>torch.tensor with a FPT>(wrap)>AST,
+     # If we try to matmul a FPT>(wrap)>torch.tensor with a FPT>(wrap)>AST,
Added the reference for the paper where the polynomial approximations are discussed.
Here is the paper (it is also added as a comment).
They discuss more functions and how to approximate them: maybe it is worth adding approximations for more functions?
Added tests and the possibility to choose between Chebyshev and Sigmoid for computing the hyperbolic tangent.
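To make the Chebyshev option concrete, here is a generic sketch of approximating tanh with a truncated Chebyshev series on a fixed interval. The interval, degree, and coefficient computation are illustrative assumptions, not the PR's exact implementation (which evaluates the polynomial over fixed-precision shares):

```python
import math

def chebyshev_tanh(x, degree=11, a=3.0):
    """Approximate tanh(x) on [-a, a] by a degree-`degree` Chebyshev series.

    Coefficients come from the discrete cosine relation at the Chebyshev
    nodes; evaluation uses Clenshaw's recurrence. Inputs outside [-a, a]
    are clamped, which is harmless since tanh saturates.
    """
    n = degree + 1
    nodes = [math.cos(math.pi * (k + 0.5) / n) for k in range(n)]
    fvals = [math.tanh(a * t) for t in nodes]
    coeffs = [
        2.0 / n * sum(fvals[k] * math.cos(math.pi * j * (k + 0.5) / n)
                      for k in range(n))
        for j in range(n)
    ]
    coeffs[0] /= 2.0
    # Clenshaw evaluation of sum_j coeffs[j] * T_j(t) at t = x / a
    t = max(-1.0, min(1.0, x / a))
    b1 = b2 = 0.0
    for c in reversed(coeffs[1:]):  # from c_degree down to c_1
        b1, b2 = c + 2.0 * t * b1 - b2, b1
    return coeffs[0] + t * b1 - b2
```

In the MPC setting, only the polynomial evaluation (additions and multiplications by public coefficients) needs to run on the shares, which is why a polynomial approximation is attractive compared with evaluating tanh directly.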