Abstract Among neural network architectures, the class of multilayer feedforward networks is perhaps the most popular. Methods using standard backpropagation perform gradient descent only in the weight space of a network with fixed topology. In general, this approach is useful only when the network architecture is chosen correctly: too small a network may not be able to capture the characteristics of the training samples, while too large a network leads to overfitting and poor generalization performance. As the amount of available data increases, the classification of large data volumes requires more depth in the structure of neural networks. These deeper Feedforward Neural Networks (FNNs) are called Deep Neural Networks (DNNs), and they have nowadays achieved great success in several real-life applications, such as the analysis of biological data, breast cancer diagnosis, medical imaging, rotating machinery fault diagnosis, and pedestrian detection.
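As an illustrative sketch (not taken from the paper), the following minimal NumPy example shows what "gradient descent only in the weight space of a network with fixed topology" means: the layer sizes are chosen in advance and only the weights and biases are updated by backpropagation, never the architecture itself. All names and hyperparameters here are assumptions for illustration.

```python
import numpy as np

# Sketch: standard backprop on a feedforward net with FIXED topology.
# Only weights/biases move during training; the architecture does not.
rng = np.random.default_rng(0)

# Fixed topology: 2 inputs -> 4 hidden units -> 1 output (plus biases).
W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)

# Toy XOR data: a network that is too small could not represent this
# mapping, illustrating why the architecture choice matters.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 0.5, []
for _ in range(5000):
    # Forward pass through the fixed architecture.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: gradients w.r.t. weights and biases only.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Stacking more hidden layers of this form yields the deep (DNN) variant of the same fixed-topology training scheme discussed above.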