Lab 3: Classification (cont.)

Multi Layer Perceptrons

Multi Layer Perceptrons are the usual neural-network representation for dealing with multidimensional data. In Python they are available through the MLPClassifier class in the sklearn.neural_network module.

Its training is done through an implementation of the backpropagation algorithm, so it makes use of a learning rate specified through the learning_rate_init parameter, a maximum number of iterations (the max_iter parameter), and a learning rate schedule chosen through the learning_rate parameter. Besides those, we may also choose among different activation functions through the activation parameter, here instantiated as the logistic function. By setting the verbose parameter we are able to follow the evolution of the error along the iterations.
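As an illustration, a minimal sketch of instantiating and training such a classifier is given below. The training data names (X_train, y_train) and the specific parameter values are placeholders chosen for the example, not values prescribed by this lab; note that the learning_rate schedule only takes effect with the 'sgd' solver.

from sklearn.neural_network import MLPClassifier

# Illustrative sketch: X_train and y_train are assumed to be already prepared.
clf = MLPClassifier(hidden_layer_sizes=(10,),   # one hidden layer with 10 units (example value)
                    activation='logistic',      # logistic activation, as mentioned in the text
                    solver='sgd',               # the learning_rate schedule applies to the SGD solver
                    learning_rate_init=0.001,   # initial learning rate
                    learning_rate='constant',   # kind of learning rate change along training
                    max_iter=500,               # maximum number of iterations
                    verbose=True,               # print the loss at each iteration
                    random_state=0)
clf.fit(X_train, y_train)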

After the plot you can see the parameters for which the best results were achieved. Let us now look at the classifier's performance, in that setting, in terms of other metrics.
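One possible sketch of such an evaluation is shown below, assuming a held-out test set (X_test, y_test) and the fitted clf from the previous step; these names are placeholders for this example.

from sklearn.metrics import accuracy_score, confusion_matrix, classification_report

# Illustrative sketch: evaluate the fitted classifier on held-out data.
y_pred = clf.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))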