Presentation Type

Poster

Start Date

5-8-2024 11:00 AM

End Date

5-8-2024 1:00 PM

Subjects

Mechanical engineering, Neural networks -- Research and applications

Advisor

Alex Hunt

Student Level

Doctoral

Abstract

As neural networks have become prolific solutions to modern problems, there has been a corresponding rise in the popularity of the numerical machine learning techniques used to design them. While they are highly generalizable, numerical methods tend to produce networks that act as inscrutable “black boxes,” making it difficult to interpret their behavior. One solution to the problem of network transparency is to use analytical techniques, but these methods are underdeveloped compared to their numerical alternatives. To enhance the viability of analytical techniques, this work extends previous efforts to quantify the impact that non-spiking neural encoding schemes have on the approximation quality of the arithmetic subnetworks of the functional subnetwork approach (FSA). In particular, novel design constraints are derived for arithmetic operations using two different encoding strategies: (1) an “absolute” scheme in which numerical values are represented directly by neuronal membrane voltages, and (2) a “relative” scheme wherein values are instead represented by the percent activation of these neurons. Numerical simulations indicate that the relative scheme offers both qualitative and quantitative advantages, including increased approximation accuracy of ~5% over normal operational ranges, better numerical conditioning, and the freedom to choose more biologically realistic subnetwork parameters. Each of these metrics is found to depend on the operation of interest and its associated gain parameters. Future work will extend the study of non-spiking neural encoding schemes beyond arithmetic operations to the more sophisticated calculus subnetworks of the FSA.
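The distinction between the two encoding strategies can be sketched in a few lines. This is a minimal illustration, not code from the poster: the function names, the operating-range parameter `R`, and the chosen values are all hypothetical, standing in for the neuron's maximum membrane voltage above rest in a non-spiking model.

```python
# Hypothetical sketch of the two non-spiking encoding schemes described
# in the abstract. R, the function names, and all values are assumptions.
R = 20e-3  # assumed operating voltage range of a neuron (volts)

def encode_absolute(x):
    """Absolute scheme: the numerical value IS the membrane voltage."""
    return x  # x must already lie within the neuron's range [0, R]

def encode_relative(x, x_max):
    """Relative scheme: the value maps to percent activation U/R."""
    return (x / x_max) * R  # x in [0, x_max] -> voltage U in [0, R]

def decode_relative(U, x_max):
    """Recover the represented value from a relative-scheme voltage."""
    return (U / R) * x_max
```

In the absolute scheme the representable values are pinned to the voltage range itself, whereas the relative scheme decouples the represented quantity from the voltage scale, which is one way the freedom to choose more biologically realistic parameters can arise.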

Keywords: Neural Encoding Schemes, Analytical Neural Network Design Techniques, Functional Subnetwork Approach

Creative Commons License or Rights Statement

Creative Commons Attribution 4.0 License
This work is licensed under a Creative Commons Attribution 4.0 License.

Persistent Identifier

https://archives.pdx.edu/ds/psu/41909


Title

Deriving Analytical Design Constraints For Absolute & Relative Encoding Schemes In Functional Subnetworks
