ReLU-based

modrelu(z: Tensor, b: float, c: float = 1e-3)

modReLU, as presented in [CIT2016-ARJOVSKY].

A variation of the ReLU named modReLU. It is a pointwise nonlinearity, \(modReLU(z) : \mathbb{C} \longrightarrow \mathbb{C}\), that affects only the absolute value of a complex number, defined as

\[modReLU(z) = ReLU(|z|+b) \cdot \frac{z}{|z|}\]
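A minimal sketch of this formula using TensorFlow ops is shown below; it is an illustration, not the library's actual implementation. Interpreting c as a small constant added to the denominator to avoid division by zero is an assumption, since the formula above divides by \(|z|\) directly.

```python
import tensorflow as tf

def modrelu_sketch(z: tf.Tensor, b: float, c: float = 1e-3) -> tf.Tensor:
    # |z| of the complex input (real-valued tensor)
    abs_z = tf.math.abs(z)
    # ReLU(|z| + b) rescales the magnitude; dividing by (|z| + c) instead of |z|
    # is an assumption made here to avoid division by zero at the origin.
    scale = tf.nn.relu(abs_z + b) / (abs_z + c)
    # Multiplying the complex input by a real-valued scale factor leaves the phase untouched.
    return tf.cast(scale, z.dtype) * z
```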
crelu(z: Tensor)

Mirror of cvnn.activations.cart_relu. Applies the Rectified Linear Unit to both the real and imaginary parts of z.

With default values, the ReLU function returns element-wise max(x, 0).

Otherwise, it follows:

\[\begin{split}f(x) = \textrm{max\_value}, \quad \textrm{for} \quad x \geq \textrm{max\_value} \\ f(x) = x, \quad \textrm{for} \quad \textrm{threshold} \leq x < \textrm{max\_value} \\ f(x) = \alpha (x - \textrm{threshold}), \quad \textrm{otherwise}\end{split}\]
Parameters: z – Input tensor.
Returns: Tensor result of the applied activation function.
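As a rough illustration of the description above, a sketch in TensorFlow (assuming complex input of dtype complex64 or complex128; not the library's exact code):

```python
import tensorflow as tf

def crelu_sketch(z: tf.Tensor) -> tf.Tensor:
    # Apply the real-valued ReLU separately to the real and imaginary parts,
    # then recombine them into a complex tensor.
    return tf.complex(tf.nn.relu(tf.math.real(z)),
                      tf.nn.relu(tf.math.imag(z)))
```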
zrelu(z: Tensor)

zReLU presented in [CIT2016-GUBERMAN]. This method outputs the input unchanged if both the real and imaginary parts are positive, and zero otherwise.

\[\begin{split}f(z) = z \quad \textrm{for} \quad 0 \leq \phi_z \leq \pi / 2 \\ f(z) = 0 \quad \textrm{elsewhere} \\\end{split}\]
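A short sketch of this rule with TensorFlow ops (an assumed implementation, for illustration only):

```python
import tensorflow as tf

def zrelu_sketch(z: tf.Tensor) -> tf.Tensor:
    # The phase lies in [0, pi/2] exactly when both the real and imaginary
    # parts are non-negative, so the output keeps z there and is 0 elsewhere.
    in_first_quadrant = tf.logical_and(tf.math.real(z) >= 0,
                                       tf.math.imag(z) >= 0)
    return tf.where(in_first_quadrant, z, tf.zeros_like(z))
```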
complex_cardioid(z: Tensor)

Complex cardioid, presented in [CIT2017-PATRICK].

This function maintains the phase information while attenuating the magnitude based on the phase itself. For real-valued inputs, it reduces to the ReLU.

\[f(z) = \frac{(1 + \cos \phi_z) \, z}{2}\]
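The formula translates almost directly into code; a hedged TensorFlow sketch (not the library's own implementation) could look like this:

```python
import tensorflow as tf

def complex_cardioid_sketch(z: tf.Tensor) -> tf.Tensor:
    # Phase phi_z of the complex input
    phase = tf.math.angle(z)
    # Attenuate the magnitude by (1 + cos(phi_z)) / 2; the phase of z is
    # preserved because z is only multiplied by a non-negative real factor.
    scale = (1.0 + tf.math.cos(phase)) / 2.0
    return tf.cast(scale, z.dtype) * z
```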
[CIT2016-ARJOVSKY]
  Arjovsky et al., "Unitary Evolution Recurrent Neural Networks", 2016.
[CIT2016-GUBERMAN]
  Guberman, "On Complex Valued Convolutional Neural Networks", 2016.
[CIT2017-PATRICK]
  Patrick et al., "Better than Real: Complex-valued Neural Nets for MRI Fingerprinting", 2017.