A comparison of methods to reduce overfitting in neural networks
- The Institute of Internet, Broadcasting and Communication (IIBC)
- International journal of advanced smart convergence
- Vol.9 No.2
- 2020.01, pp. 173 - 178 (6 pages)
A common problem in neural network training is overfitting: the model fits the specifics of the training data too closely. In this paper, several methods for avoiding overfitting were compared: regularization, dropout, different amounts of training data, and different types of neural networks. Comparative experiments on these methods were conducted to evaluate test accuracy. The results show that using more training data is more effective than the regularization and dropout methods. Moreover, deep convolutional neural networks outperform both multi-layer neural networks and simple convolutional neural networks.
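Two of the methods compared in the abstract, dropout and regularization, can be sketched briefly. The function names, shapes, and hyperparameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate, training=True):
    """Inverted dropout: randomly zero units at the given rate during
    training and rescale the survivors so expected activation is unchanged."""
    if not training or rate == 0.0:
        return activations
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

def l2_penalty(weights, lam):
    """L2 regularization term added to the loss: lam * sum(w^2).
    Penalizes large weights to discourage fitting noise in the data."""
    return lam * np.sum(weights ** 2)

h = np.ones((4, 8))                 # example hidden-layer activations
dropped = dropout(h, rate=0.5)      # roughly half the units are zeroed
w = np.full(10, 0.1)                # example weight vector
penalty = l2_penalty(w, lam=0.01)   # 0.01 * (10 * 0.1**2) = 0.001
```

At test time `dropout` is called with `training=False`, so the full network is used; the L2 penalty is simply added to the training loss before computing gradients.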