Wavelet theory provides an attractive approach to signal and image compression. This work investigates a new wavelet transform coefficient selection approach for efficient image compression. For a desired image compression ratio (50:1), wavelet scale thresholds are derived via a multiagent stochastic optimization process. Previous work has demonstrated an interscale relationship between the stochastically optimized wavelet coefficient thresholds. Based on the experimental results, a deterministic wavelet coefficient selection criterion is hypothesized, and the constants of the equation are statistically derived. Experimental results of the stochastic optimization and deterministic approaches are compared with results from previously published wavelet coefficient threshold strategies.
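As an illustration of the general mechanism, the following is a minimal sketch of per-scale hard thresholding of wavelet coefficients, assuming the PyWavelets package; the threshold values passed in are placeholders, standing in for the stochastically optimized scale thresholds derived in the paper.

```python
# Per-scale hard thresholding of a 2-D wavelet decomposition.
# The entries of `thresholds` are placeholders, not the optimized
# scale thresholds described in the abstract.
import numpy as np
import pywt

def threshold_by_scale(image, thresholds, wavelet="db4"):
    coeffs = pywt.wavedec2(image, wavelet, level=len(thresholds))
    kept = [coeffs[0]]  # approximation coefficients are retained
    for t, detail in zip(thresholds, coeffs[1:]):
        # Zero every detail coefficient whose magnitude falls below
        # the threshold assigned to this scale.
        kept.append(tuple(np.where(np.abs(d) >= t, d, 0.0) for d in detail))
    return pywt.waverec2(kept, wavelet)
```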
Finite mixture models (mixture models) estimate probability density functions as a weighted combination of component density functions. This work investigates a combined stochastic and deterministic optimization approach of a generalized kernel function for multivariate mixture density estimation. Mixture models are selected and optimized by combining the optimization characteristics of a multi-agent stochastic optimization algorithm, based on evolutionary programming, with those of the EM algorithm. A classification problem is approached by optimizing a mixture density estimate for each class. Rissanen's minimum description length (MDL) criterion provides the selection mechanism for evaluating mixture models. A comparison of each class's posterior probability (Bayes rule) provides the classification decision procedure. A 2-D, two-class classification problem is posed, and the classification performance of the optimal mixture models is compared with that of a kernel estimator whose bandwidth is optimized using least-squares cross-validation.
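A minimal sketch of the per-class density-estimation classifier, assuming scikit-learn: BIC is used here as a stand-in for Rissanen's MDL criterion (both are penalized-likelihood model selectors), and plain EM replaces the evolutionary-programming/EM hybrid.

```python
# Fit one mixture density per class, select the component count by a
# penalized-likelihood criterion, and classify by maximum posterior.
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_class_density(X, max_components=5):
    models = [GaussianMixture(n_components=k, random_state=0).fit(X)
              for k in range(1, max_components + 1)]
    return min(models, key=lambda m: m.bic(X))  # MDL-like selection

def classify(x, class_models, priors):
    # Bayes rule: argmax over classes of log prior + log density.
    scores = [np.log(p) + m.score_samples(x.reshape(1, -1))[0]
              for m, p in zip(class_models, priors)]
    return int(np.argmax(scores))
```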
This work investigates the application of evolutionary search to cascade-correlation learning architectures. Evolutionary programming is used to generate the hidden weights of each candidate hidden unit in the cascade-correlation learning paradigm, while the output weights are adapted using deterministic techniques. Evolutionary search is also used to modify the connectivity of each candidate unit so that parsimonious structures may be generated during the neural network construction process. This approach is appealing from a computational perspective since only a population of hidden units is optimized, rather than a population of entire neural networks. Results are given for selected low-dimensional examples.
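A minimal sketch of the candidate-unit evolution, under simplifying assumptions: truncation selection with fixed Gaussian mutation replaces the full evolutionary programming machinery, and each candidate is scored by the usual cascade-correlation objective, the magnitude of the covariance between its activation and the residual network error.

```python
# Evolve the input weights of a single candidate hidden unit.
import numpy as np

def evolve_candidate(X, residual, pop_size=20, generations=50, sigma=0.1):
    rng = np.random.default_rng(0)
    W = rng.normal(size=(pop_size, X.shape[1]))

    def score(w):
        a = np.tanh(X @ w)  # candidate unit activation
        return abs(np.sum((a - a.mean()) * (residual - residual.mean())))

    for _ in range(generations):
        offspring = W + rng.normal(scale=sigma, size=W.shape)
        combined = np.vstack([W, offspring])
        ranks = np.argsort([-score(w) for w in combined])
        W = combined[ranks[:pop_size]]  # keep the best candidates
    return W[0]  # hidden weights of the winning candidate unit
```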
This work investigates the application of evolutionary programming to automatically configuring neural network architectures for pattern classification tasks. The evolutionary programming search procedure implements a parallel nonlinear regression technique and represents a powerful method for evaluating a multitude of neural network model hypotheses. The evolutionary programming search is augmented with the Solis & Wets random optimization method, thereby maintaining the integrity of the stochastic search while taking into account empirical information about the response surface. A network architecture is proposed that is motivated by the structures generated in projection pursuit regression and the cascade-correlation learning architecture. Results are given for the 3-bit parity, normally distributed data, and T-C classifier problems.
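A minimal sketch of the single-agent refinement step, assuming the basic Solis & Wets scheme: a biased Gaussian perturbation is tried forward and backward, and the bias accumulates the directions that succeed (the step-size adaptation schedule of the full method is omitted here).

```python
# One Solis & Wets random-optimization run on a cost function f.
import numpy as np

def solis_wets(f, x, iterations=200, sigma=1.0):
    rng = np.random.default_rng(0)
    bias = np.zeros_like(x)
    fx = f(x)
    for _ in range(iterations):
        d = rng.normal(loc=bias, scale=sigma)
        f_fwd = f(x + d)
        if f_fwd < fx:                 # forward step improved the cost
            x, fx, bias = x + d, f_fwd, 0.2 * bias + 0.4 * d
        else:
            f_bwd = f(x - d)
            if f_bwd < fx:             # reverse step improved the cost
                x, fx, bias = x - d, f_bwd, bias - 0.4 * d
            else:                      # both failed: decay the bias
                bias = 0.5 * bias
    return x, fx
```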
This work investigates the application of evolutionary programming, a multi-agent stochastic search technique, to the generation of recurrent perceptrons (nonlinear IIR filters) for time-series prediction tasks. The evolutionary programming paradigm is discussed and analogies are made to classical stochastic optimization methods. A hybrid optimization scheme is proposed based on multi-agent and single-agent random optimization techniques. This method is then used to determine both the model order and the weight coefficients of linear, nonlinear, and parallel linear-nonlinear next-step predictors. Akaike's information criterion (AIC) is used as the cost function to score each candidate solution.
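A minimal sketch of the scoring step, assuming the common regression form of the AIC, N log(MSE) + 2k; `predict` is a hypothetical callable standing in for an evolved next-step predictor.

```python
# Score a candidate next-step predictor on a time series by AIC.
import numpy as np

def aic_score(series, predict, n_params):
    # One-step-ahead prediction errors over the series.
    errors = [series[t + 1] - predict(series[: t + 1])
              for t in range(len(series) - 1)]
    mse = np.mean(np.square(errors))
    return len(errors) * np.log(mse) + 2 * n_params  # lower is better
```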
This work investigates the application of a stochastic search technique, evolutionary programming, to developing self-organizing neural networks. The chosen stochastic search method is capable of simultaneously evolving both network architecture and weights. The numbers of synapses and neurons are incorporated into an objective function so that network parameter optimization is performed with respect to computational cost as well as mean pattern error. Experiments are conducted using feedforward networks on simple binary mapping problems.
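A minimal sketch of the kind of objective described, with illustrative weighting constants (not the paper's) trading mean pattern error against structural cost.

```python
# Complexity-penalized fitness for an evolved network.
def network_cost(mean_pattern_error, n_neurons, n_synapses,
                 alpha=0.01, beta=0.001):
    # alpha and beta are placeholder penalty weights on structure size.
    return mean_pattern_error + alpha * n_neurons + beta * n_synapses
```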