Classically, Relu computes the following on its input:
Relu(x) = max(0, x)
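For reference, here is a minimal plaintext sketch of the same computation in NumPy. This is purely illustrative and not part of tf-encrypted:

    import numpy as np

    def relu(x: np.ndarray) -> np.ndarray:
        # Element-wise max(0, x)
        return np.maximum(0, x)

    print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]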
In tf-encrypted, how Relu behaves depends on the underlying protocol. Under SecureNN, Relu behaves exactly as you would expect: Relu(x) = max(0, x). Under Pond, which does not provide exact secure comparisons, Relu is instead approximated (typically by a polynomial approximation).
Note that the backward pass (backward) is not implemented for Relu.
Parameters:
    x (PondTensor) – The input tensor.

Return type:
    PondTensor

Returns:
    A pond tensor with the same backing type as the input tensor.
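A minimal usage sketch under SecureNN. The player name, input function, and session handling below are illustrative assumptions (they depend on your tf-encrypted configuration), not prescribed by this page:

    import tensorflow as tf
    import tf_encrypted as tfe

    # Assumption: with SecureNN, relu is exact; under Pond it would be approximated
    tfe.set_protocol(tfe.protocol.SecureNN())

    def provide_input() -> tf.Tensor:
        # Plaintext input contributed by an (assumed) input provider
        return tf.constant([[-2.0, -0.5, 0.0, 1.5]])

    x = tfe.define_private_input('input-provider', provide_input)
    y = tfe.get_protocol().relu(x)  # PondTensor in, PondTensor out

    with tfe.Session() as sess:
        # Revealed here only for demonstration; normally y would stay private
        print(sess.run(y.reveal()))  # approx. [[0. 0. 0. 1.5]]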
get_output_shape() → List[int]

Returns the layer's output shape.
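A short illustrative sketch, assuming the Relu layer from tf_encrypted.layers.activation is constructed with an input shape and, like other activation layers, preserves that shape (assumptions, not stated on this page):

    from tf_encrypted.layers.activation import Relu

    # Assumed constructor: activation layers take an input shape
    layer = Relu(input_shape=[1, 4])
    print(layer.get_output_shape())  # expected: [1, 4]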