Our kernels assume that input tensors occupy contiguous memory. However, torch's autograd may return tensors with non-contiguous memory, so we need to call the `contiguous` method to ensure the memory layout is contiguous.
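A minimal PyTorch sketch of the issue (the tensor names here are illustrative, not from the actual kernels): operations such as transpose return non-contiguous views, and calling `contiguous` materializes a contiguous copy before the tensor is handed to a kernel.

```python
import torch

# A transpose produces a non-contiguous view; a kernel that assumes
# row-major contiguous memory would read its buffer incorrectly.
x = torch.arange(6.0).reshape(2, 3)
y = x.t()
assert not y.is_contiguous()

# .contiguous() copies the data into a contiguous buffer when needed
# (it is a no-op if the tensor is already contiguous).
z = y.contiguous()
assert z.is_contiguous()
assert torch.equal(z, y)  # same values, contiguous layout
```

Calling `contiguous` on an already-contiguous tensor returns it unchanged, so applying it defensively before a kernel launch costs nothing in the common case.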