Published In
International Joint Conference on Neural Networks
Document Type
Post-Print
Publication Date
7-2001
Subjects
Neural networks -- Structure, Pattern recognition, System theory
Abstract
The absence of a priori knowledge about a problem domain typically forces the use of overly complex neural network structures. An information-theoretic method based on calculating information transmission is applied to training data to obtain a priori knowledge that is useful for prestructuring (reducing the complexity of) neural networks. The method is applied to a continuous system, and it is shown that such prestructuring reduces training time and enhances generalization capability.
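The "information transmission" the abstract refers to is the Shannon mutual information T(X:Y) = H(X) + H(Y) - H(X,Y) between observed variables; a minimal sketch of computing it from discretized training data (function and variable names are hypothetical, not from the paper):

```python
import numpy as np

def entropy(symbols):
    """Shannon entropy (in bits) of a sequence of discrete symbols."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def transmission(x, y):
    """Information transmission T(X:Y) = H(X) + H(Y) - H(X,Y)."""
    # Encode each joint state as a string so np.unique can count it.
    joint = [f"{a},{b}" for a, b in zip(x, y)]
    return entropy(x) + entropy(y) - entropy(joint)

# Toy training data: y copies x, so T(X:Y) = H(X) = 1 bit.
x = [0, 1, 0, 1]
y = [0, 1, 0, 1]
print(transmission(x, y))  # 1.0

# z is independent of x, so the transmission is 0 bits.
z = [0, 0, 1, 1]
print(transmission(x, z))  # 0.0
```

Variable pairs with near-zero transmission are candidates for disconnection when prestructuring the network, since the data carry no evidence of a dependency between them.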
Rights
This is the post-print version (author manuscript). The final published version is available from the publisher, © Copyright IEEE:
https://doi.org/10.1109/IJCNN.2001.939047
DOI
10.1109/IJCNN.2001.939047
Persistent Identifier
https://archives.pdx.edu/ds/psu/36258
Citation Details
B. Chambless, G. G. Lendaris and M. Zwick, "An information theoretic methodology for prestructuring neural networks," [Post-print] IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222), 2001, pp. 365-370 vol.1, doi: 10.1109/IJCNN.2001.939047.