Deep Learning
Troubleshooting and Debugging
Hyperparameters
Optimizer
Batch Normalization
- Ordering of batch normalization and dropout? - stackoverflow
- Use the order BN → Dropout. With Dropout → BN, the variance of the values fed into batch normalization differs between training and inference, which degrades performance.
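A small NumPy simulation, assuming standard inverted dropout (p = 0.5) on roughly unit-variance activations, shows why the Dropout → BN order is problematic: the variance BN observes during training is about twice what it sees at inference, so its running statistics are mismatched.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # activations with (roughly) unit variance
p = 0.5                       # dropout rate

# Training-time inverted dropout: zero out units, scale survivors by 1/(1-p).
mask = rng.random(x.shape) >= p
x_drop = x * mask / (1 - p)

# Variance a following BN layer would see during training vs. at inference
# (dropout is disabled at inference, so BN receives the raw activations).
var_train = x_drop.var()  # ~2.0 for p = 0.5
var_infer = x.var()       # ~1.0

# The mismatch is what the note above warns about.
print(var_train, var_infer)
```

With BN placed before dropout, BN always sees the raw activations, so its running mean and variance stay valid at inference time.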
MLP (Multilayer Perceptron)
How to Configure the Number of Layers and Nodes in a Neural Network - Jason Brownlee
- Rules of thumb quoted there:
  - “One hidden layer is sufficient for the large majority of problems.”
  - “the optimal size of the hidden layer is usually between the size of the input and size of the output layers.”
  - “The number of hidden neurons should be 2/3 the size of the input layer, plus the size of the output layer.”
  - “The number of hidden neurons should be less than twice the size of the input layer.”
  - “the number of neurons in that layer is the mean of the neurons in the input and output layers.”
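These rules of thumb are easy to compare side by side. A minimal sketch (the function name and return layout are my own, not from the article) that evaluates each heuristic for a given input/output size:

```python
def hidden_size_heuristics(n_in: int, n_out: int) -> dict:
    """Hidden-layer sizes suggested by common rules of thumb.

    n_in:  number of input units
    n_out: number of output units
    """
    return {
        # "mean of the neurons in the input and output layers"
        "mean_of_in_and_out": (n_in + n_out) // 2,
        # "2/3 the size of the input layer, plus the size of the output layer"
        "two_thirds_in_plus_out": (2 * n_in) // 3 + n_out,
        # "less than twice the size of the input layer" (upper bound)
        "upper_bound": 2 * n_in - 1,
    }


sizes = hidden_size_heuristics(100, 10)
print(sizes)  # e.g. {'mean_of_in_and_out': 55, 'two_thirds_in_plus_out': 76, 'upper_bound': 199}
```

Note these are starting points for a search, not guarantees; the heuristics can disagree by a wide margin, as the example shows.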
Number of nodes in hidden layers of neural network - Cross Validated