Published In

International Joint Conference on Neural Networks

Document Type

Post-Print

Publication Date

7-2001

Subjects

Neural networks -- Structure, Pattern recognition, System theory

Abstract

The absence of a priori knowledge about a problem domain typically forces the use of overly complex neural network structures. An information-theoretic method based on calculating information transmission is applied to training data to obtain a priori knowledge useful for prestructuring (reducing the complexity of) neural networks. The method is applied to a continuous system, and it is shown that such prestructuring reduces training time and enhances generalization capability.
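The abstract does not spell out how "information transmission" is computed, so the following is only a minimal sketch of one common interpretation: transmission as multivariate mutual information, T(X:Y) = H(X) + H(Y) - H(X,Y), estimated from binned training data. The variable names, bin count, and the toy data are illustrative assumptions, not taken from the paper; the idea is that input subsets transmitting most information to the output are candidates for dedicated subnetworks in a prestructured (reduced-complexity) network.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a 1-D array of discrete symbols."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def joint_labels(columns):
    """Collapse several discrete columns into one composite symbol per row."""
    return np.array(["|".join(map(str, row)) for row in np.stack(columns, axis=1)])

def information_transmission(x_cols, y):
    """T(X1,...,Xk : Y) = H(X) + H(Y) - H(X,Y), in bits (an assumed definition)."""
    x = joint_labels(x_cols)
    xy = joint_labels(list(x_cols) + [y])
    return entropy(x) + entropy(y) - entropy(xy)

# Toy continuous data (hypothetical): y depends mainly on x1, weakly on noise.
rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = rng.normal(size=500)
y = np.sin(x1) + 0.1 * rng.normal(size=500)

# Discretize continuous variables into equal-frequency bins before
# estimating entropies from frequency counts.
bins = 8
def discretize(v):
    edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(v, edges)

dx1, dx2, dy = discretize(x1), discretize(x2), discretize(y)

# Rank candidate input subsets by how much information they transmit
# to the output; here T(x1:y) should dominate T(x2:y).
print("T(x1:y)    =", information_transmission([dx1], dy))
print("T(x2:y)    =", information_transmission([dx2], dy))
print("T(x1,x2:y) =", information_transmission([dx1, dx2], dy))
```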

Rights

This is the post-print version (author manuscript). The final published version is available from the publisher (© IEEE):
https://doi.org/10.1109/IJCNN.2001.939047

DOI

10.1109/IJCNN.2001.939047

Persistent Identifier

https://archives.pdx.edu/ds/psu/36258
