Age and gender classification are two important problems that have recently gained popularity in the research community, owing to their wide range of applications. Research has shown that both age and gender information are encoded in face shape and texture, hence the active appearance model (AAM), a statistical model that captures shape and texture variations, has been one of the most widely used feature extraction techniques for these problems. However, AAM suffers from some drawbacks, especially when used for classification. This is primarily because principal component analysis (PCA), which is at the core of the model, works in an unsupervised manner: the PCA dimensionality reduction does not take into account how the predictor variables relate to the response (class labels). Rather, it explores only the underlying structure of the predictor variables, so PCA may discard parts of the data that carry discriminatory information. To address this, we propose a supervised appearance model (sAM) that improves on AAM by replacing PCA with partial least-squares regression. This feature extraction technique is then used for the problems of age and gender classification. Our experiments show that sAM has better predictive power than the conventional AAM.
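To make the distinction concrete, the following sketch (not the authors' sAM code; it uses scikit-learn, and all names, dimensions, and data are illustrative placeholders) contrasts unsupervised PCA projection with supervised partial least-squares components computed against class labels:

```python
# Illustrative sketch only: unsupervised PCA vs. supervised PLS regression
# as dimensionality-reduction steps before a classifier. This is NOT the
# authors' sAM implementation; shapes and data are made up.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # stand-in for concatenated shape/texture features
y = rng.integers(0, 2, size=200)      # stand-in for binary labels (e.g. gender)

# Unsupervised reduction: PCA never sees y, only the structure of X.
pca_scores = PCA(n_components=5).fit_transform(X)

# Supervised reduction: PLS picks components that covary with the labels.
pls = PLSRegression(n_components=5).fit(X, y)
pls_scores = pls.transform(X)

# Either set of scores can feed a downstream classifier.
clf = LogisticRegression().fit(pls_scores, y)
print("training accuracy on PLS scores:", clf.score(pls_scores, y))
```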
In most gait recognition techniques, both static and dynamic features are used to define a subject's gait signature. In this study, the existence of a relationship between static and dynamic features was investigated. The correlation coefficient was used to analyse the relationship between features extracted from the "University of Bradford Multi-Modal Gait Database". The study includes two-dimensional dynamic and static features from 19 subjects. The dynamic features comprised phase-weighted magnitudes derived from a Fourier transform of the temporal rotational data of a subject's joints (knee, thigh, shoulder, and elbow). The results show that eleven pairs of features are significantly correlated (p < 0.05). This indicates the existence of a statistical relationship between static and dynamic features, which challenges the findings of several similar studies. These results bear great potential for further research in the area, and could contribute to the creation of a gait signature using latent data.
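A minimal sketch of the kind of analysis described, assuming placeholder data: a Fourier transform of a joint-angle series yields phase-weighted magnitudes, and one such dynamic feature is correlated with a static feature across subjects via the Pearson coefficient. The feature choices, sample sizes, and data below are hypothetical, not those of the study.

```python
# Sketch only: phase-weighted magnitudes from an FFT of a joint-angle series,
# correlated with a static feature across subjects. All data are placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_subjects = 19

dynamic_feature = []
for _ in range(n_subjects):
    knee_angle = rng.normal(size=128)        # stand-in for temporal rotational data
    spectrum = np.fft.rfft(knee_angle)
    # Phase-weighted magnitude: each harmonic's magnitude weighted by its phase.
    pwm = np.abs(spectrum) * np.angle(spectrum)
    dynamic_feature.append(pwm[1])           # e.g. keep the first harmonic

static_feature = rng.normal(loc=1.0, scale=0.1, size=n_subjects)  # e.g. a limb length

r, p = pearsonr(static_feature, np.asarray(dynamic_feature))
print(f"Pearson r = {r:.3f}, p = {p:.3f}  (significant if p < 0.05)")
```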
The classical approach to converting colour to greyscale is to code the luminance signal as a grey value image. However, the problem with this approach is that the detail at equiluminant edges vanishes, and in the worst case the greyscale reproduction of an equiluminant image is a single uniform grey value. The solution to this problem, adopted by all algorithms in the field, is to try to code colour difference (or contrast) in the greyscale image. In this paper we reconsider the Socolinsky and Wolff algorithm for colour to greyscale conversion. This algorithm, which is the most mathematically elegant, often scores well in preference experiments but can introduce artefacts which spoil the appearance of the final image. These artefacts are intrinsic to the method and stem from the underlying approach, which computes a greyscale image by a) calculating approximate luminance-type derivatives for the colour image and b) re-integrating these to obtain a greyscale image. Unfortunately, the sign of the derivative vector is sometimes unknown on an equiluminant edge and, in the current theory, is set arbitrarily. However, choosing the wrong sign can lead to unnatural contrast gradients (not apparent in the colour original). Our contribution is to show how this sign problem can be ameliorated using a generalised definition of luminance and a Markov relaxation.
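As a small illustration of why equiluminant edges are lost under the classical mapping (and why gradient-domain methods then face a sign choice), the sketch below applies a standard weighted-sum luminance conversion to two different colours tuned, hypothetically, to have (nearly) equal luminance. It uses Rec. 601 weights and is not the algorithm of this paper.

```python
# Sketch of the classical luminance mapping: two visibly different colours with
# (nearly) equal luminance map to (nearly) the same grey, so the edge between
# them vanishes. Colours and weights are illustrative (Rec. 601).
import numpy as np

def luminance(rgb):
    """Classical colour-to-grey mapping: weighted sum of R, G, B."""
    return rgb @ np.array([0.299, 0.587, 0.114])

left  = np.array([0.8, 0.3, 0.5])       # one colour patch
right = np.array([0.3, 0.5546, 0.5])    # a different colour with matching luminance

print(luminance(left), luminance(right))
# Both greys are essentially identical, so the colour edge disappears.
# Gradient-domain methods encode the colour difference instead, but must then
# choose a sign for the grey gradient at such equiluminant edges.
```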