Sponsor
Portland State University. Department of Electrical and Computer Engineering
First Advisor
George G. Lendaris
Term of Graduation
2011
Date of Publication
1-1-2011
Document Type
Dissertation
Degree Name
Doctor of Philosophy (Ph.D.) in Electrical and Computer Engineering
Department
Electrical and Computer Engineering
Language
English
Subjects
Clave, Machine learning, Prestructuring, Samba, Computational intelligence, Musical meter and rhythm, Neural networks (Computer science) -- Methodology, Information theory in music, Perceptrons
DOI
10.15760/etd.384
Physical Description
1 online resource (xxviii, 515 pages)
Abstract
The present study shows that prestructuring based on domain knowledge leads to statistically significant generalization-performance improvement in artificial neural networks (NNs) of the multilayer perceptron (MLP) type, specifically in the case of a noisy real-world problem with numerous interacting variables.
The prestructuring of MLPs based on knowledge of the structure of a problem domain has previously been shown to improve generalization performance. However, the problem domains for those demonstrations suffered from significant shortcomings: 1) They were purely logical problems, and 2) they contained small numbers of variables in comparison to most data-mining applications today. Two implications of the former were a) the underlying structure of the problem was completely known to the network designer by virtue of having been conceived for the problem at hand, and b) noise was not a significant concern in contrast with real-world conditions. As for the size of the problem, neither computational resources nor mathematical modeling techniques were advanced enough to handle complex relationships among more than a few variables until recently, so such problems were left out of the mainstream of prestructuring investigations.
In the present work, domain knowledge is built into the solution through Reconstructability Analysis, a form of information-theoretic modeling, which is used to identify mathematical models that can be transformed into a graphic representation of the problem domain's underlying structure. Employing the latter as a pattern allows the researcher to prestructure the MLP, for instance, by disallowing certain connections in the network. Prestructuring reduces the set of all possible maps (SAPM) that are realizable by the NN. The reduced SAPM (according to the Lendaris-Stanley conjecture, conditional probability, and Occam's razor) enables better generalization performance than with a fully connected MLP that has learned the same I/O mapping to the same extent.
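The mechanism described above, disallowing certain connections so that the network's realizable maps are restricted to those consistent with the domain structure, can be sketched with a binary connectivity mask on the first-layer weights. The mask below is purely illustrative (the dissertation's actual RA-derived structure for the clave-direction problem is different and much larger); it only demonstrates how a prestructuring constraint is enforced during both the forward pass and weight updates.

```python
import numpy as np

# Illustrative connectivity mask (NOT the dissertation's RA-derived structure):
# each hidden unit is allowed to see only one subset of the input variables.
n_in, n_hidden, n_out = 4, 2, 1
mask = np.array([[1.0, 1.0, 0.0, 0.0],   # hidden unit 0: inputs 0 and 1 only
                 [0.0, 0.0, 1.0, 1.0]])  # hidden unit 1: inputs 2 and 3 only

rng = np.random.default_rng(0)
W1 = rng.normal(size=(n_hidden, n_in)) * mask  # disallowed weights start at zero
W2 = rng.normal(size=(n_out, n_hidden))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, t, lr=0.5):
    """One backprop step; the mask zeroes updates to disallowed weights."""
    global W1, W2
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)
    d_out = (y - t) * y * (1 - y)
    d_hid = (W2.T @ d_out) * h * (1 - h)
    W2 -= lr * np.outer(d_out, h)
    W1 -= lr * np.outer(d_hid, x) * mask  # prestructuring constraint
    return float(((y - t) ** 2).sum())

x = np.array([1.0, 0.0, 0.0, 1.0])
t = np.array([1.0])
for _ in range(200):
    loss = train_step(x, t)

# Disallowed connections remain absent after training.
assert np.all(W1 * (1.0 - mask) == 0.0)
```

Because the mask multiplies both the initial weights and every gradient update, the disallowed connections can never become nonzero, which is one simple way to realize the reduced SAPM in practice.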
In addition to showing statistically significant improvement over the generalization performance of fully connected networks, the prestructured networks in the present study also compared favorably to both the performance of qualified human agents and the generalization rates in classification through Reconstructability Analysis alone, which serves as the alternative algorithm for comparison.
Rights
In Copyright. URI: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
Persistent Identifier
http://archives.pdx.edu/ds/psu/7132
Recommended Citation
Vurkaç, Mehmet, "Prestructuring Multilayer Perceptrons based on Information-Theoretic Modeling of a Partido-Alto-based Grammar for Afro-Brazilian Music: Enhanced Generalization and Principles of Parsimony, including an Investigation of Statistical Paradigms" (2011). Dissertations and Theses. Paper 384.
https://doi.org/10.15760/etd.384