Paper
Collapsing multiple hidden layers in feedforward neural networks to a single hidden layer
Jeffrey L. Blue, Lawrence O. Hall
22 March 1996
Abstract
Feedforward neural networks are often configured with multiple hidden layers. The relative simplicity of a single hidden layer may allow a broader range of learning algorithms to be applied than is possible with multiple hidden layers. Hardware mapping may also be simplified when every network is implemented with a single hidden layer. An algorithm is presented that collapses a network with multiple hidden layers to a single hidden layer. The algorithm replaces links between hidden layers with new units whose links bypass the intermediate hidden layers. The link weights of these new units are calculated so that each new unit approximates the contribution of the link it replaces. Trials with a variety of network configurations and data sets demonstrate the validity and effectiveness of the approach.
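The abstract describes the collapsing idea only at a high level. As a rough illustrative sketch (not the authors' published procedure), the Python code below collapses a network with two hidden layers by replacing the second hidden layer's incoming links with direct input-to-unit weights obtained from a first-order linearization of the first hidden layer around a reference input. The sigmoid activations, the choice of linearization point, the two-layer restriction, and all function names are assumptions made for illustration only.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def collapse_two_hidden_layers(W1, b1, W2, b2, X_ref):
    # Approximate a net (input -> h1 -> h2) by a single hidden layer whose
    # units take their links directly from the inputs, bypassing h1.
    # W1: (n_in, n_h1), b1: (n_h1,)   first hidden layer
    # W2: (n_h1, n_h2), b2: (n_h2,)   second hidden layer
    # X_ref: (n_samples, n_in)        reference inputs used to pick the
    #                                 linearization point (an assumption,
    #                                 not necessarily the paper's choice).
    x0 = X_ref.mean(axis=0)                 # operating point
    a1 = sigmoid(x0 @ W1 + b1)              # first-layer activations at x0
    d1 = a1 * (1.0 - a1)                    # sigmoid derivative at x0
    # First-order expansion: a1(x) ~= a1(x0) + d1 * ((x - x0) @ W1).
    # Substituting into the second layer yields direct input->h2 weights.
    W_new = W1 @ (np.diag(d1) @ W2)                   # (n_in, n_h2)
    b_new = (a1 - d1 * (x0 @ W1)) @ W2 + b2           # (n_h2,)
    return W_new, b_new

# Small check on random weights: the collapsed layer should track the
# original two-layer composition for inputs near the reference data.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 6)) * 0.5, rng.normal(size=6) * 0.1
W2, b2 = rng.normal(size=(6, 3)) * 0.5, rng.normal(size=3) * 0.1
X = rng.normal(size=(50, 4)) * 0.2
W_new, b_new = collapse_two_hidden_layers(W1, b1, W2, b2, X)
orig = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
coll = sigmoid(X @ W_new + b_new)
print("mean |difference|:", np.abs(orig - coll).mean())

The approximation is exact at the reference point and degrades as inputs move away from it, which is one simple way a bypassing unit can "approximate the replaced link's contribution"; the paper's own weight-calculation rule may differ.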
© (1996) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jeffrey L. Blue and Lawrence O. Hall "Collapsing multiple hidden layers in feedforward neural networks to a single hidden layer", Proc. SPIE 2760, Applications and Science of Artificial Neural Networks II, (22 March 1996); https://doi.org/10.1117/12.235964
KEYWORDS
Neural networks
IRIS Consortium
Binary data
Algorithm development
Iris
Computer engineering
Computer hardware