Document Type
Conference Proceeding
Publication Date
1999
Subjects
Machine learning, Self-organizing systems -- Design and construction, Artificial intelligence
Abstract
The "Learning Hardware" approach involves creating a computational network based on feedback from the environment (for instance, positive and negative examples from the trainer) and realizing this network in an array of Field Programmable Gate Arrays (FPGAs). Computational networks can be built based on incremental supervised learning (Neural Net training) or global construction (Decision Tree design). Here we advocate an approach to Learning Hardware based on Constructive Induction methods of Machine Learning (ML) using multivalued functions. This is contrasted with the Evolvable Hardware (EHW) approach, in which learning/evolution is based on the genetic algorithm alone. Various approaches to supervised inductive learning for Data Mining and Machine Learning applications require fast operations on complex logic expressions and the solution of NP-complete problems such as graph coloring or set covering. They should therefore be realized in hardware to obtain the necessary speed-ups. Using a fast prototyping tool, the DEC-PERLE-1 board based on an array of Xilinx FPGAs, we are developing virtual processors that accelerate the design and optimization of decomposed networks of arbitrary logic blocks.
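To make the abstract's mention of NP-complete subproblems concrete, here is a minimal software sketch (not from the paper; the function name and example data are illustrative) of the standard greedy heuristic for set covering, one of the problems the authors propose to accelerate in FPGA hardware:

```python
def greedy_set_cover(universe, subsets):
    """Greedy heuristic: repeatedly pick the subset that covers the most
    still-uncovered elements until the whole universe is covered.
    Set covering is NP-complete, so exact solutions do not scale; this
    approximation runs in polynomial time."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        # Choose the subset with the largest overlap with uncovered elements.
        best = max(subsets, key=lambda s: len(uncovered & s))
        if not uncovered & best:
            raise ValueError("universe cannot be covered by the given subsets")
        cover.append(best)
        uncovered -= best
    return cover

if __name__ == "__main__":
    universe = range(1, 8)
    subsets = [{1, 2, 3}, {3, 4, 5}, {5, 6, 7}, {1, 4, 7}]
    print(greedy_set_cover(universe, subsets))
```

Even this simple heuristic spends most of its time on repeated set intersections, which is the kind of bit-parallel logic operation that maps naturally onto an FPGA fabric.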
Persistent Identifier
http://archives.pdx.edu/ds/psu/12785
Citation Details
Perkowski, Marek, Stanislaw Grygiel, Qihong Chen, and Dave Mattson. "Constructive induction machines for data mining." (1999).
Description
Originally presented at the International Symposium on Future of Intellectual Integrated Electronics, held at Tohoku University in 1999, and subsequently included in its proceedings.