Breast cancer is one of the deadliest cancers affecting middle-aged women, and accurate diagnosis and prognosis are crucial to reducing its high death rate. Numerous diagnostic tools for breast cancer are available today. In this paper we discuss the role of nuclear segmentation from fine needle aspiration (FNA) biopsy slides and its influence on malignancy classification. Classification of malignancy plays a very important role in the diagnosis of breast cancer. Among all cancer diagnostic tools, FNA slides provide the most valuable
information about the malignancy grade, which helps in choosing an appropriate treatment. This process
involves assessing numerous nuclear features, so precise segmentation of nuclei is essential.
In this work we compare three powerful segmentation approaches and test their impact on the classification of
breast cancer malignancy: level set segmentation, fuzzy c-means segmentation,
and textural segmentation based on the co-occurrence matrix.
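As a concrete illustration of one of the compared approaches, the sketch below clusters the grey levels of an FNA image with fuzzy c-means and labels each pixel by its highest membership (e.g. nucleus versus background versus cytoplasm). It is a minimal sketch only; the image file name, the number of clusters and the fuzziness exponent are assumptions, not details taken from the paper.

import numpy as np

def fuzzy_cmeans(values, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    # values: (N,) array of grey levels; returns cluster centres and memberships.
    rng = np.random.default_rng(seed)
    u = rng.random((n_clusters, values.size))
    u /= u.sum(axis=0, keepdims=True)               # memberships sum to 1 per pixel
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ values) / um.sum(axis=1)    # fuzzily weighted cluster centres
        dist = np.abs(values[None, :] - centers[:, None]) + 1e-12
        new_u = dist ** (-2.0 / (m - 1.0))          # standard FCM membership update
        new_u /= new_u.sum(axis=0, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            return centers, new_u
        u = new_u
    return centers, u

# Hypothetical usage on a grey-scale FNA slide image:
# image = skimage.io.imread("fna_slide.png", as_gray=True)
# centers, u = fuzzy_cmeans(image.ravel())
# labels = u.argmax(axis=0).reshape(image.shape)    # per-pixel segmentation map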
Segmented nuclei were used to extract nuclear features for malignancy classification. For classification, four
different classifiers were trained and tested on the extracted features: the Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), a Principal Component-based Neural Network
(PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation performs
best among the three compared approaches and leads to good feature extraction, with the lowest average
error rate of 6.51% across the four classifiers. The best single result was recorded for the multilayer perceptron,
with an error rate of 3.07% using fuzzy c-means segmentation.
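The abstract does not give the training protocol, but a classifier comparison of this kind can be sketched with scikit-learn as follows; the random feature matrix merely stands in for the extracted nuclear features (e.g. area, perimeter, texture per nucleus), and the train/test split and hyperparameters are assumptions.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Placeholder data: 200 nuclei described by 10 features, with binary malignancy labels.
rng = np.random.default_rng(0)
X = rng.random((200, 10))
y = rng.integers(0, 2, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("MLP", MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    error_rate = 1.0 - clf.score(X_te, y_te)        # fraction of misclassified nuclei
    print(f"{name} error rate: {error_rate:.2%}")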
This paper discusses the possibility of exploiting thermovision imaging and artificial neural networks for facial recognition systems. A biometric system that is able to identify people from thermograms is presented. To identify a person we used the Eigenfaces algorithm; for detecting the face in the picture, a backpropagation neural network was designed. For this purpose, thermograms of 10 people under various external conditions were studied. The Eigenfaces
algorithm calculated an average face, and a set of characteristic features was then produced for each studied person.
The neural network has to detect the face in the image before it can actually be identified; five hidden layers were used for that purpose. It was shown that recognition errors depend on the feature extraction: for low-quality pictures the error was as
high as 30%, whereas for pictures with good feature extraction, correct identification rates above 90% were obtained.
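The Eigenfaces procedure referred to above amounts to PCA on the centred training set (an average face plus its principal directions); a minimal sketch of that idea is given below, with all array shapes and names being assumptions rather than the authors' implementation.

import numpy as np

def eigenfaces(train_imgs, n_components=20):
    # train_imgs: (N, H*W) flattened thermograms; returns the average face
    # and the leading eigenfaces (principal directions of the centred data).
    mean_face = train_imgs.mean(axis=0)
    centered = train_imgs - mean_face
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:n_components]

def identify(probe, gallery, labels, mean_face, eigvecs):
    # Project probe and gallery into face space and return the nearest gallery label.
    p = (probe - mean_face) @ eigvecs.T
    g = (gallery - mean_face) @ eigvecs.T
    return labels[np.argmin(np.linalg.norm(g - p, axis=1))]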
In this paper we present a parallel code which performs iterative image deconvolution using either a spatially-invariant point spread function (SI-PSF) or a spatially-variant point spread function (SV-PSF). The basic algorithm is described, along with its parallel implementation. Applications and results in the area of medical x-ray imaging are discussed.
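The abstract does not name the iterative scheme, so purely as an illustration the sketch below uses Richardson-Lucy deconvolution with a spatially-invariant PSF; the function name, iteration count and convergence handling are assumptions and not taken from the paper.

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, n_iter=30, eps=1e-12):
    # Iterative deconvolution with a spatially-invariant PSF (SI-PSF).
    estimate = np.full(blurred.shape, blurred.mean(), dtype=float)
    psf_flipped = psf[::-1, ::-1]                   # adjoint of convolution with psf
    for _ in range(n_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)         # compare data with current model
        estimate = estimate * fftconvolve(ratio, psf_flipped, mode="same")
    return estimate

A spatially-variant PSF breaks the single-convolution model, which is why SV-PSF deconvolution is typically handled by splitting the image into regions with locally constant PSFs; the parallel decomposition described in the paper is not reproduced here.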