Function deep_learning_playground::perceptron::single::logical_perceptron
pub fn logical_perceptron<T: Float>(
w1: &'static T,
w2: &'static T,
bias: &'static T
) -> Box<dyn Fn(bool, bool) -> bool>
logical_perceptron is the logical perceptron.
It can generate several logical gates (and_perceptron, nand_perceptron, or_perceptron)
by returning a function that passes the weighted sum of the input signals
(\(b+X\cdot W\), where \(b\in\mathbb{R}\) is the bias,
\(W^{2\times 1}=\left(w_1,w_2\right)^T\), \(X^{1\times 2}=\left(x_1,x_2\right)\), and
\(w_1\) and \(w_2\) are the weights of \(x_1\) and \(x_2\)) to the activation function.
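Conceptually, the returned closure applies a step activation to the weighted sum. The following is a minimal sketch of how such a perceptron could be written, not necessarily the crate's actual implementation; it assumes the Float bound comes from the num-traits crate and that the neuron fires when the sum is strictly positive.

use num_traits::Float;

// Sketch: build a closure that computes b + x1*w1 + x2*w2 and applies a
// step activation. The "fires when strictly positive" threshold is an
// assumption, not taken from the crate's source.
pub fn logical_perceptron<T: Float>(
    w1: &'static T,
    w2: &'static T,
    bias: &'static T,
) -> Box<dyn Fn(bool, bool) -> bool> {
    Box::new(move |x1, x2| {
        // Encode the boolean inputs as 0 or 1.
        let x1 = if x1 { T::one() } else { T::zero() };
        let x2 = if x2 { T::one() } else { T::zero() };
        // Weighted sum passed to the step activation.
        *bias + x1 * *w1 + x2 * *w2 > T::zero()
    })
}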
Arguments
w1 - Weight for signal x1
w2 - Weight for signal x2
bias - Bias that determines how easily the neuron fires
Example
assert_eq!(true, deep_learning_playground::perceptron::single::logical_perceptron(&0.5, &0.5, &-0.7)(true, true));
assert_eq!(false, deep_learning_playground::perceptron::single::logical_perceptron(&-0.5, &-0.5, &0.7)(true, true));
assert_eq!(true, deep_learning_playground::perceptron::single::logical_perceptron(&0.5, &0.5, &-0.2)(true, false));
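As a usage sketch, the three weight/bias settings from the example above yield the AND, NAND, and OR gates mentioned earlier. The gate bindings below are hypothetical names chosen for illustration, not part of the crate's API.

use deep_learning_playground::perceptron::single::logical_perceptron;

fn main() {
    // Same weights and biases as in the example assertions above.
    let and_gate = logical_perceptron(&0.5, &0.5, &-0.7);
    let nand_gate = logical_perceptron(&-0.5, &-0.5, &0.7);
    let or_gate = logical_perceptron(&0.5, &0.5, &-0.2);

    assert_eq!(true, and_gate(true, true));
    assert_eq!(false, and_gate(true, false));
    assert_eq!(true, nand_gate(true, false));
    assert_eq!(false, nand_gate(true, true));
    assert_eq!(true, or_gate(true, false));
    assert_eq!(false, or_gate(false, false));
}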