Module deep_learning_playground::perceptron::single

Functions

and_perceptron

and_perceptron generates the logical AND function. Let \(p\) and \(q\) be logical variables; and_perceptron generates a function that satisfies the truth table of logical AND: it returns 1 only when both \(p\) and \(q\) are 1, and 0 otherwise.
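For instance, one choice of parameters that realises AND (the concrete values used by the crate are not shown here, so treat these as illustrative) is \(w_1 = w_2 = 0.5\) and \(b = -0.7\): \[ \text{step}(-0.7 + 0.5p + 0.5q) = \begin{cases} 1 & (p = q = 1,\ \text{sum} = 0.3 \gt 0) \\ 0 & (\text{otherwise},\ \text{sum} \le -0.2) \end{cases} \]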

logical_perceptron

logical_perceptron is the generic logical perceptron. It can generate the logical gates (and_perceptron, nand_perceptron, or_perceptron) and produces a function that passes the sum of weighted signals, \(b+X\cdot W\), to the activation function, where \(b\in\mathbb{R}\) is a bias value, \(W^{2\times 1}=\left(w_1,w_2\right)^T\), \(X^{1\times 2}=\left(x_1,x_2\right)\), and \(w_1\) and \(w_2\) are the weights of \(x_1\) and \(x_2\).
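As a rough illustration of how such a generator can work, the sketch below builds a two-input perceptron from a bias and two weights and returns a closure. The signature and parameter names are assumptions made for the example, not the crate's actual API.

```rust
/// Minimal sketch (assumed signature, not the crate's actual API):
/// build a two-input perceptron p(x1, x2) = step(b + x1*w1 + x2*w2).
fn logical_perceptron(b: f64, w1: f64, w2: f64) -> impl Fn(f64, f64) -> f64 {
    // The returned closure computes the weighted sum b + X . W and
    // applies the step activation (1 if positive, 0 otherwise).
    move |x1, x2| if b + x1 * w1 + x2 * w2 > 0.0 { 1.0 } else { 0.0 }
}

fn main() {
    // Illustrative weights that realise logical AND:
    // only the input (1, 1) pushes the sum above the threshold.
    let and = logical_perceptron(-0.7, 0.5, 0.5);
    assert_eq!(and(0.0, 0.0), 0.0);
    assert_eq!(and(1.0, 0.0), 0.0);
    assert_eq!(and(1.0, 1.0), 1.0);
}
```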

nand_perceptron

nand_perceptron generates the logical NAND function. Let \(p\) and \(q\) be logical variables; nand_perceptron generates a function that satisfies the truth table of logical NAND: it returns 0 only when both \(p\) and \(q\) are 1, and 1 otherwise.
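Negating the illustrative AND parameters gives one possible NAND configuration (again an assumption about concrete values): \(w_1 = w_2 = -0.5\) and \(b = 0.7\). The weighted sum \(0.7 - 0.5p - 0.5q\) is \(-0.3 \le 0\) only for \(p = q = 1\) and positive otherwise, which reproduces the NAND output.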

or_perceptron

or_perceptron generates the logical OR function. Let \(p\) and \(q\) be logical variables; the generated function, called as or_perceptron()(\(p\), \(q\)), satisfies the truth table of logical OR: it returns 1 when at least one of \(p\) and \(q\) is 1, and 0 when both are 0.
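One illustrative OR configuration (values assumed for the example): \(w_1 = w_2 = 0.5\) and \(b = -0.2\). The weighted sum \(-0.2 + 0.5p + 0.5q\) is positive whenever at least one input is 1, and is \(-0.2 \le 0\) when both inputs are 0.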

step_function

step_function is a kind of activation function. It is defined as \[ \text{step}(x) = \begin{cases} 1 & (x\gt 0) \\ 0 & (\text{otherwise}) \end{cases} \] where \(x\in\mathbb{R}\).
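A direct Rust transcription of this definition might look like the following sketch (the crate's concrete signature may differ):

```rust
/// Step activation: returns 1 for positive input, 0 otherwise.
/// (Illustrative signature; the crate's actual one may differ.)
fn step_function(x: f64) -> f64 {
    if x > 0.0 { 1.0 } else { 0.0 }
}

fn main() {
    assert_eq!(step_function(0.3), 1.0);
    assert_eq!(step_function(0.0), 0.0); // step(0) = 0 per the definition
    assert_eq!(step_function(-1.5), 0.0);
}
```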