3.1 XOR-Gate with MLP
XOR-Gate with Multilayer Perceptron
🎙️Claudio Acuña
The aim of this example is to learn the two-input XOR logic gate with a fully connected multilayer perceptron. The animation shows how the ANN classifies the inputs, labeled -1 and +1, according to their target outputs, also coded as -1 and +1; under these conditions, the network separates the two classes with decision lines. The network is trained with backpropagation, and students studied the mathematics of gradient descent as it minimizes the squared-error loss function. In addition, the animation depicts the epoch-by-epoch evolution of the hidden-layer weights "V". Training stops once the error falls below 0.05.
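The setup described above can be sketched in NumPy. This is a minimal illustration, not the exact network from the animation: the hidden-layer size, the tanh activation, and the learning rate are assumptions; only the -1/+1 coding, the squared-error loss, gradient descent via backpropagation, the hidden-layer weights `V`, and the 0.05 stopping criterion come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table with inputs and targets coded as -1/+1, as in the text
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([[-1], [1], [1], [-1]], dtype=float)

# V: hidden-layer weights (the weights shown evolving in the animation)
# W: output-layer weights. Sizes and init scale are assumptions.
V = rng.normal(0.0, 0.5, (2, 4))   # 2 inputs -> 4 hidden units
bV = np.zeros(4)
W = rng.normal(0.0, 0.5, (4, 1))   # 4 hidden units -> 1 output
bW = np.zeros(1)

lr = 0.3  # assumed learning rate
for epoch in range(20000):
    # Forward pass: tanh keeps activations in (-1, 1), matching the coding
    H = np.tanh(X @ V + bV)        # hidden-layer activations
    out = np.tanh(H @ W + bW)      # network output

    err = out - y
    mse = np.mean(err ** 2)        # squared-error loss
    if mse < 0.05:                 # stopping criterion from the text
        break

    # Backpropagation: gradient of the squared error through tanh layers
    d_out = err * (1.0 - out ** 2)           # tanh'(z) = 1 - tanh(z)^2
    d_hid = (d_out @ W.T) * (1.0 - H ** 2)

    # Gradient-descent updates, averaged over the four patterns
    W -= lr * H.T @ d_out / len(X)
    bW -= lr * d_out.mean(axis=0)
    V -= lr * X.T @ d_hid / len(X)
    bV -= lr * d_hid.mean(axis=0)

print(np.round(out, 2))  # outputs should approach the target pattern
```

Because XOR is not linearly separable, a single-layer perceptron cannot solve it; the hidden layer gives the network the two decision lines mentioned above, which together carve the input space into the correct regions.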