
Development of optimal network structures for back-propagation-trained neural networks

Qing Guan, University of Nebraska - Lincoln

Abstract

A critical question in neural network research today is how many hidden neurons to use. There is no magic formula: the answer depends largely on the complexity of the problem being solved, so the potential performance impact of hidden layers and neurons must be taken into account during network development. This study focuses on how to develop an optimal neural network model for a specific task; that is, for a given task, the goal is to find a network structure with a minimal number of layers, a minimal number of units in each layer, and good generalization ability. A process for building such an optimal network structure is proposed in this study. The core of this process is the direct weight pruning method, which is based on mathematical deduction and on a property of the dominant subnet of a network trained by a back-propagation algorithm with normalized input data. The smallest-magnitude weight in the trained network is pruned sequentially; when no further pruning is possible, the isolated units of the network are deleted, simplifying the original trained network. The proposed process is evaluated on two common benchmark problems, XOR and parity, demonstrating that the new pruning method produces optimal network models while remaining both simple and efficient. The process is also evaluated on a real-world application problem: firm bankruptcy prediction. The performance of the neural network is compared to that of multivariate discriminant analysis models on matched bankruptcy samples. The neural network structure produced by the proposed process offers a superior modeling approach for firm bankruptcy prediction.
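The two pruning steps the abstract describes — zeroing the smallest-magnitude weight and then deleting units left with no connections — can be sketched as below. This is a minimal illustrative sketch, not the dissertation's actual implementation: the function names are hypothetical, the network is represented simply as a list of layer weight matrices, and the retraining/validation step that decides when "no further pruning is possible" is omitted.

```python
import numpy as np

def prune_smallest_weight(weights):
    """Zero out the single smallest-magnitude nonzero weight across all
    layer matrices in `weights`. Returns False if no nonzero weights remain.
    (Hypothetical helper; the stopping criterion from the study is not modeled.)"""
    best = None  # (layer_index, flat_index, magnitude)
    for li, w in enumerate(weights):
        nz = np.flatnonzero(w)
        if nz.size == 0:
            continue
        k = nz[np.argmin(np.abs(w.ravel()[nz]))]
        m = abs(w.ravel()[k])
        if best is None or m < best[2]:
            best = (li, k, m)
    if best is None:
        return False
    li, k, _ = best
    weights[li].ravel()[k] = 0.0  # ravel() is a view, so this edits in place
    return True

def remove_isolated_units(weights):
    """Delete hidden units that have lost all incoming or all outgoing
    weights, shrinking the adjacent weight matrices accordingly."""
    for li in range(len(weights) - 1):
        w_in, w_out = weights[li], weights[li + 1]
        # hidden unit j is isolated if its incoming column or outgoing row is all zero
        keep = (np.abs(w_in).sum(axis=0) > 0) & (np.abs(w_out).sum(axis=1) > 0)
        weights[li] = w_in[:, keep]
        weights[li + 1] = w_out[keep, :]
    return weights

# Toy example: 2 inputs -> 2 hidden units -> 1 output.
w = [np.array([[1.0, 0.01],
               [2.0, 0.02]]),   # input -> hidden
     np.array([[1.0],
               [0.5]])]         # hidden -> output

prune_smallest_weight(w)  # zeros the 0.01 connection
prune_smallest_weight(w)  # zeros the 0.02 connection
w = remove_isolated_units(w)  # hidden unit 2 now has no inputs and is deleted
```

After the two smallest weights are pruned, the second hidden unit has no remaining incoming connections, so the cleanup step removes it and the network shrinks to a single hidden unit.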

Subject Area

Management|Computer science|Artificial intelligence

Recommended Citation

Guan, Qing, "Development of optimal network structures for back-propagation-trained neural networks" (1993). ETD collection for University of Nebraska-Lincoln. AAI9333966.
https://digitalcommons.unl.edu/dissertations/AAI9333966
