Function deep_learning_playground::neural_network::activate_functions::rectified_linear_unit
pub fn rectified_linear_unit<T: Float>() -> Box<dyn Fn(Array2<T>) -> Array2<T>>
Generates the Rectified Linear Unit (ReLU) function
\[
\text{ReLU}(x)=\begin{cases}
x & (x\gt 0) \\
0 & (\text{otherwise})
\end{cases}
\]
for `Array2`, where \(x\in\mathbb{R}\) and ReLU is applied element-wise.
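As a sketch of the factory pattern this signature implies (a function that returns a boxed closure), the following uses `Vec<f64>` in place of `ndarray`'s `Array2<T>` so the example compiles with the standard library alone; the names here are illustrative, not the crate's actual internals:

```rust
// Simplified sketch: a factory returning a boxed ReLU closure.
// Vec<f64> stands in for Array2<T>, which would require the ndarray crate.
fn relu_factory() -> Box<dyn Fn(Vec<f64>) -> Vec<f64>> {
    Box::new(|xs| {
        xs.into_iter()
            // ReLU: keep x when x > 0, otherwise clamp to 0
            .map(|x| if x > 0.0 { x } else { 0.0 })
            .collect()
    })
}

fn main() {
    let relu = relu_factory();
    let out = relu(vec![-2.0, -0.5, 0.0, 1.5]);
    assert_eq!(out, vec![0.0, 0.0, 0.0, 1.5]);
}
```

The boxed-closure return type lets callers store and pass the activation function as a value, which matches how the real `rectified_linear_unit` is meant to be plugged into a network layer.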