Python scikit learn MLPClassifier hidden_layer_sizes
hidden_layer_sizes=(7,)
if you want only 1 hidden layer with 7 hidden units.
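For instance, a minimal sketch of that single-hidden-layer setup (the toy dataset from make_classification and the variable names here are purely illustrative):

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Toy binary classification data, purely illustrative
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# One hidden layer with 7 units
clf = MLPClassifier(hidden_layer_sizes=(7,), max_iter=1000, random_state=0)
clf.fit(X, y)

print(clf.n_layers_)   # 3 = input layer + 1 hidden layer + output layer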
length = n_layers - 2
is because, in addition to the hidden layers, the network always has 1 input layer and 1 output layer.
In the docs:
hidden_layer_sizes : tuple, length = n_layers - 2, default (100,)
This means:
hidden_layer_sizes is a tuple of length (n_layers - 2).
n_layers is the total number of layers in the desired architecture.
2 is subtracted from n_layers because the input and output layers are not hidden layers, so they do not count toward the tuple length.
default (100,) means that if no value is provided for hidden_layer_sizes, the default architecture has one input layer, one hidden layer with 100 units, and one output layer.
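A quick sketch to confirm the default (nothing is fitted here; this only inspects the parameter value):

from sklearn.neural_network import MLPClassifier

# No hidden_layer_sizes passed, so the documented default applies
clf = MLPClassifier()
print(clf.hidden_layer_sizes)   # (100,) -> a single hidden layer with 100 units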
From the docs again:
The ith element represents the number of neurons in the ith hidden layer.
This means each entry in the tuple is the number of neurons in the corresponding hidden layer.
Examples:
- For the architecture 56:25:11:7:5:3:1, with 56 inputs and 1 output, the hidden layers are 25:11:7:5:3, so hidden_layer_sizes = (25, 11, 7, 5, 3) (sketched in code below).
- For the architecture 3:45:2:11:2, with 3 inputs and 2 outputs, the hidden layers are 45:2:11, so hidden_layer_sizes = (45, 2, 11).
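As a sketch, the first example above could be set up as follows; the random data standing in for a 56-feature, binary-target dataset is an assumption made just for illustration:

import numpy as np
from sklearn.neural_network import MLPClassifier

# Random stand-in for a dataset with 56 features and a binary target
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 56))
y = rng.integers(0, 2, size=300)

# Architecture 56:25:11:7:5:3:1 -> five hidden layers
clf = MLPClassifier(hidden_layer_sizes=(25, 11, 7, 5, 3),
                    max_iter=500, random_state=0)
clf.fit(X, y)

print(clf.n_layers_)                  # 7 = input + 5 hidden + output
print([c.shape for c in clf.coefs_])  # [(56, 25), (25, 11), (11, 7), (7, 5), (5, 3), (3, 1)]

The shapes of clf.coefs_ show how the tuple maps onto the weight matrices: each hidden-layer size appears as the second dimension of one matrix and the first dimension of the next.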