Date of Award

3-1-2019

Document Type

Thesis

Degree Name

Bachelor of Science (B.S.) in Electrical Engineering and University Honors

Department

Electrical and Computer Engineering

First Advisor

Dan Hammerstrom

Subjects

Dendrites, Transfer functions, Neural networks, Machine learning

DOI

10.15760/honors.719

Abstract

Dendritic branch operations in pyramidal neurons are well understood in vivo, but their potential as computational assets in deep neural networks has not been explored. The pre-processing that dendrites perform may decrease the error of an artificial neuron because each dendrite serves as an independent filtering mechanism that may prevent false positives. To test this hypothesis, a fully-connected layer implementing the dendritic transfer function is defined and used to replace the final fully-connected layer of a standard convolutional neural network (CNN). Results show that the defined algorithm is unable to predict better than chance, and possible causes are discussed. A framework for developing future dendritic layers is established.
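The thesis does not reproduce its dendritic transfer function here, but the idea described in the abstract can be sketched. The snippet below is a hypothetical illustration, not the author's implementation: each output neuron partitions its inputs across independent dendritic branches, each branch computes a partial sum that is suppressed below a threshold (one guess at the "independent filtering" that might prevent false positives), and the soma sums the surviving branch outputs. The function name, branching scheme, and thresholding rule are all assumptions.

```python
import numpy as np

def dendritic_layer(x, weights, n_dendrites, threshold=0.0):
    """Hypothetical dendritic fully-connected layer (illustrative sketch).

    x        : input vector, shape (n_in,)
    weights  : weight matrix, shape (n_out, n_in)
    Each output neuron splits its n_in inputs evenly across `n_dendrites`
    branches; each branch is an independent filter whose sub-threshold
    activity is zeroed before somatic summation.
    """
    n_out, n_in = weights.shape
    assert x.shape == (n_in,) and n_in % n_dendrites == 0
    branch_size = n_in // n_dendrites
    # Partition inputs and weights into independent dendritic branches.
    xb = x.reshape(n_dendrites, branch_size)
    wb = weights.reshape(n_out, n_dendrites, branch_size)
    # Per-branch partial sums: shape (n_out, n_dendrites).
    branch_sums = np.einsum('odb,db->od', wb, xb)
    # Branch-level filtering: suppress sub-threshold branch activity.
    filtered = np.where(branch_sums > threshold, branch_sums, 0.0)
    # Somatic summation over surviving branches.
    return filtered.sum(axis=1)
```

Replacing the final fully-connected layer of a CNN, as the abstract describes, would amount to applying this function to the flattened feature vector instead of a plain matrix-vector product.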

Persistent Identifier

https://archives.pdx.edu/ds/psu/28835
