The widespread use of multisensor technology and the emergence of big datasets have created the need to develop tools to reduce, approximate, and classify large and multimodal data such as higher-order tensors. While early approaches focused on matrix- and vector-based methods to represent these higher-order data, more recently it has been shown that tensor decomposition methods are better equipped to capture couplings across their different modes. For these reasons, tensor decomposition methods have found applications in many different signal processing problems including dimensionality reduction, signal separation, linear regression, feature extraction, and classification. However, most of the existing tensor decomposition methods are based on the principle of finding a low-rank approximation in a linear subspace structure, where the definition of rank may change depending on the particular decomposition. Since many datasets are not necessarily low-rank in a linear subspace, this often results in high approximation errors or low compression rates. In this paper, we introduce a new adaptive, multi-scale tensor decomposition method for higher-order data inspired by hybrid linear modeling and subspace clustering techniques. In particular, we develop a multi-scale higher-order singular value decomposition (MS-HoSVD) approach where a given tensor is first permuted and then partitioned into several sub-tensors each of which can be represented as a low-rank tensor with increased representational efficiency. The proposed approach is evaluated for dimensionality reduction and classification for several different real-life tensor signals with promising results.
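To make the building block concrete, below is a minimal sketch of the truncated higher-order SVD (HoSVD) step that MS-HoSVD applies to each sub-tensor. The ranks, tensor sizes, and the plain-NumPy implementation are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def unfold(tensor, mode):
    """Matricize a tensor along the given mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def truncated_hosvd(tensor, ranks):
    """Return the core tensor and factor matrices of a rank-truncated HoSVD."""
    factors = []
    for mode, r in enumerate(ranks):
        # Left singular vectors of the mode-n unfolding span the mode subspace.
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Project the tensor onto each mode subspace to obtain the core.
    core = tensor
    for mode, U in enumerate(factors):
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

# Example: compress a random 20 x 20 x 20 tensor to multilinear rank (5, 5, 5).
X = np.random.rand(20, 20, 20)
core, factors = truncated_hosvd(X, (5, 5, 5))
```

MS-HoSVD's contribution is to apply such low-rank approximations per sub-tensor after permutation and partitioning, rather than once globally.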
The human brain is a complex network with connections across different regions. Understanding the functional connectivity (FC) of the brain is important during both resting state and task performance, as disruptions in connectivity patterns are indicators of different psychopathological and neurological diseases. In this work, we study the resting state functional connectivity networks (FCNs) of the brain from fMRI BOLD signals. Recent studies have shown that FCNs are dynamic even during resting state, and understanding the temporal dynamics of FCNs is important for differentiating between different conditions. Therefore, it is important to develop algorithms that track the dynamic formation and dissociation of FCNs of the brain during resting state. In this paper, we propose a two-step tensor-based community detection algorithm to identify and track the brain network community structure across time. First, we introduce an information-theoretic function to reduce the dynamic FCN and identify topologically similar time points, which are combined into a tensor; these time points will be used to identify the different FC states. Second, a tensor-based spectral clustering approach is developed to identify the community structure of the constructed tensors. The proposed algorithm applies Tucker decomposition to the constructed tensors and extracts the orthogonal factor matrices along the connectivity mode to determine the common subspace within each FC state. The detected community structure is summarized and described as FC states. The results illustrate the dynamic structure of resting state networks (RSNs), including the default mode network, somatomotor network, subcortical network, and visual network.
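A minimal sketch of the second step follows: Tucker decomposition of a stacked FCN tensor, then clustering of the connectivity-mode factor matrix. The library choices (tensorly, scikit-learn), the toy data, and the rank and cluster counts are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker
from sklearn.cluster import KMeans

# Toy tensor: 10 topologically similar time points of a 64-node FCN
# (node x node x time); real input would come from fMRI BOLD correlations.
fcn_tensor = tl.tensor(np.random.rand(64, 64, 10))

# Orthogonal factor matrices from Tucker; the first two modes index nodes.
core, factors = tucker(fcn_tensor, rank=[8, 8, 3])
U = factors[0]  # common connectivity subspace across the stacked time points

# Spectral-clustering-style community assignment from the subspace rows.
labels = KMeans(n_clusters=4, n_init=10).fit_predict(U)
```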
The quantification of synchrony is important for the study of large-scale interactions in the brain. Current synchrony measures depend on the energy of the signals rather than their phase and therefore cannot be reliably used as measures of neural synchrony. Moreover, current methods are limited to pairs of signals: they cannot quantify the synchrony across a group of electrodes or over time-varying frequency regions. In this paper, we propose two new measures for quantifying the synchrony between pairs and among groups of electrodes using time-frequency analysis. The proposed measures are applied to electroencephalogram (EEG) data to quantify neural synchrony.
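For reference, here is a minimal sketch of a standard pairwise phase synchrony measure, the phase-locking value (PLV), computed from analytic-signal phases. The paper's proposed measures are time-frequency based and also cover groups of electrodes, which this toy pairwise example does not.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV in [0, 1]: 1 means a perfectly locked phase difference."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + 0.5)  # constant phase lag: PLV near 1
z = np.sin(2 * np.pi * 10 * t + 0.1 * np.cumsum(np.random.randn(t.size)))
print(phase_locking_value(x, y), phase_locking_value(x, z))
```

Note that the PLV depends only on the phase difference, which is exactly the property the abstract argues energy-based measures lack.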
Microaneurysm (MA) detection is a critical step in diabetic retinopathy screening, since MAs are the earliest visible warning of potential future problems. A variety of algorithms have been proposed for MA detection in mass screening. The core technology of most existing methods is a directional mathematical morphological operation, the "Top-Hat" filter, which requires multiple filtering operations at each pixel. Background structure, uneven illumination, and noise often cause confusion between MAs and some non-MA structures and limit the applicability of the filter. In this paper, a novel detection framework based on edge-directed inference is proposed for MA detection. The candidate MA regions are first delineated from the edge map of a fundus image. Features measuring shape, brightness, and contrast are extracted for each candidate MA region to better separate true MAs from false detections. Algorithmic analysis and empirical evaluation reveal that the proposed edge-directed inference outperforms the "Top-Hat" based algorithm in both detection accuracy and computational speed.
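Below is a minimal sketch of the baseline "Top-Hat" approach the paper compares against: a morphological top-hat highlights small structures such as MA candidates. The multi-orientation directional filtering is approximated here with a single square footprint for brevity, the threshold is an ad hoc choice, and the image path is hypothetical.

```python
import numpy as np
from scipy import ndimage
from imageio.v3 import imread

fundus = imread("fundus.png")          # hypothetical input image
green = fundus[..., 1].astype(float)   # MAs are most visible in the green channel

# MAs appear as small dark blobs, so a black top-hat (closing minus image)
# emphasizes them against the slowly varying background.
candidates = ndimage.black_tophat(green, size=7)
mask = candidates > candidates.mean() + 3 * candidates.std()  # crude threshold
```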
In array processing applications, it is desirable to extract the sources that generate the observed signals. There are various source separation and component extraction algorithms in the literature, including principal component analysis (PCA) and independent component analysis (ICA). However, most of these methods are formulated in the time domain and thus are not designed to deal with time-varying signals. In this paper, we introduce a new time-frequency based decomposition method using an information measure as the decomposition criterion. It is shown that under the assumption of source signals that are disjoint on the time-frequency plane, this method can extract the sources up to a scalar factor. Based on the QR decomposition of the mixing matrix, the source extraction algorithm is reduced to finding the optimal N-dimensional rotation of the observed time-frequency distributions. The proposed algorithm is implemented using the steepest descent approach to find the optimal rotation angle. The performance of the method is illustrated for example signals and compared to some well-known decomposition techniques.
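A minimal sketch of the rotation idea for two whitened mixtures: search for the rotation angle whose outputs have the most disjoint (lowest-entropy) time-frequency distributions. Spectrograms stand in for the paper's TFDs, a grid search stands in for steepest descent, and the Shannon entropy criterion is an illustrative stand-in for the paper's information measure.

```python
import numpy as np
from scipy.signal import spectrogram

def tf_entropy(x, fs=1000):
    """Shannon entropy of a spectrogram normalized to unit sum."""
    _, _, S = spectrogram(x, fs=fs)
    p = S / S.sum()
    return -np.sum(p * np.log(p + 1e-12))

def best_rotation(mixtures, fs=1000):
    """mixtures: 2 x T array of whitened observations."""
    angles = np.linspace(0, np.pi / 2, 90, endpoint=False)
    costs = []
    for theta in angles:
        c, s = np.cos(theta), np.sin(theta)
        y = np.array([[c, -s], [s, c]]) @ mixtures
        # Disjoint sources concentrate energy, minimizing the total entropy.
        costs.append(tf_entropy(y[0], fs) + tf_entropy(y[1], fs))
    return angles[int(np.argmin(costs))]
```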
Information processing theory aims to quantify how well signals encode information and how well systems process information. Time-frequency distributions have been used to represent the energy distribution of time-varying signals for the past twenty years, and the properties of these representations have been studied extensively. However, there is a general lack of quantitative analysis describing the amount of information encoded in a time-frequency distribution. This paper aims to quantify how well time-frequency distributions represent information by using information-theoretic distance measures. Different distance measures, such as the Kullback-Leibler and Rényi distances, will be adapted to the time-frequency plane, and their performance in quantifying the information in a given signal will be compared. A sensitivity analysis of the different distance measures will be carried out to assess their robustness under perturbation. Several example signals will be considered to illustrate information processing in time-frequency distributions.
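A minimal sketch of the adaptation step: normalize two TFDs to unit sum and treat them as probability distributions before applying a distance. Spectrograms are used here because they are nonnegative; distributions with negative values need extra care that this sketch omits, and the choice of alpha = 3 is an illustrative convention.

```python
import numpy as np
from scipy.signal import spectrogram

def normalized_tfd(x, fs=1000):
    """Spectrogram normalized to sum to one, treated as a distribution."""
    _, _, S = spectrogram(x, fs=fs)
    return S / S.sum()

def kl_distance(P, Q, eps=1e-12):
    return np.sum(P * np.log((P + eps) / (Q + eps)))

def renyi_distance(P, Q, alpha=3, eps=1e-12):
    return np.log(np.sum((P + eps) ** alpha * (Q + eps) ** (1 - alpha))) / (alpha - 1)

t = np.linspace(0, 1, 2000)
P = normalized_tfd(np.sin(2 * np.pi * 50 * t))                 # pure tone
Q = normalized_tfd(np.sin(2 * np.pi * (50 + 200 * t) * t))     # chirp
print(kl_distance(P, Q), renyi_distance(P, Q))
```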
Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae detection based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial image quality enhancement and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature of each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
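A minimal sketch of the feature extraction idea: fit a GGD to the wavelet coefficients of each detail subband and collect the fitted (shape, scale) parameters as the feature vector. PyWavelets and scipy's gennorm are illustrative library choices, and the wavelet and level count are assumptions.

```python
import numpy as np
import pywt
from scipy.stats import gennorm

def ggd_features(image, wavelet="db4", levels=3):
    """Return (shape, scale) GGD parameters for each detail subband."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    features = []
    for detail_level in coeffs[1:]:        # skip the approximation band
        for band in detail_level:          # horizontal, vertical, diagonal
            # Fix the location at zero: wavelet detail coefficients are
            # approximately zero-mean.
            beta, _, scale = gennorm.fit(band.ravel(), floc=0)
            features.append((beta, scale))
    return np.array(features)
```

Two fingerprints can then be matched by a distance between their feature vectors; the paper combines the GGD parameters across subbands for this comparison.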
A comprehensive theory for time-frequency based signal detection has been developed during the past decade. The time-frequency detectors proposed in the literature are linear structures operating on the time-frequency representation of the signals and are equivalent to quadratic receivers defined in the time domain. In this paper, an information-theoretic approach for signal detection on the time-frequency plane is introduced. In recent years, Rényi entropy has been proposed as an effective measure for quantifying signal complexity on the time-frequency plane, and some important properties of this measure have been proven. In this paper, a new approach that uses the entropy functional as the test statistic for signal detection is developed. The minimum error detection algorithm is derived, and the performance of this new signal detection method is demonstrated through examples.
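A minimal sketch of the test statistic: a signal present in noise concentrates energy on the time-frequency plane and lowers the Rényi entropy relative to noise alone. The spectrogram stands in for a general TFD, and the decision threshold would be set from the derived minimum error criterion rather than by inspection as here.

```python
import numpy as np
from scipy.signal import spectrogram

def renyi_entropy(x, alpha=3, fs=1000):
    """Rényi entropy of the normalized spectrogram of x."""
    _, _, S = spectrogram(x, fs=fs)
    P = S / S.sum()
    return np.log2(np.sum(P ** alpha)) / (1 - alpha)

t = np.linspace(0, 1, 2000)
noise = np.random.randn(t.size)
signal = np.sin(2 * np.pi * 100 * t) + noise
# The signal-present case yields a lower (more concentrated) entropy.
print(renyi_entropy(noise), renyi_entropy(signal))
```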
KEYWORDS: Digital watermarking, Image quality, Image filtering, Digital imaging, Image processing, Digital filtering, Digital image processing, Image restoration, Data communications, Multimedia
With the rapid development of wireless communication systems, transmission of digital multimedia has become widespread. This brings with it the issue of copyright protection for digital works. Digital watermarking is the process of embedding data inside a host image such that it does not degrade the perceptual quality of the image. In recent years, many watermarking algorithms have been introduced for the purposes of copyright protection, broadcast monitoring, and covert communication. In this paper, a new transform based watermarking algorithm for digital images is introduced. This method uses the Singular Value Decomposition to obtain the eigenimages of a given image, which are known to be the best orthogonal basis for expressing that image in a least squares sense. The watermark is embedded by changing the strength of the singular values. The strength of the watermark is determined based on the entropy of the eigenimages to ensure robustness and imperceptibility simultaneously. A corresponding watermark detection and extraction algorithm is proposed. The performance of the algorithm is illustrated through an example under different types of attacks that a digital image can undergo during transmission.
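A minimal sketch of SVD-based embedding: perturb the singular values of the host image with the watermark. The single scalar strength alpha here stands in for the paper's entropy-based, per-eigenimage strength selection, and the detector assumes the original singular values are available.

```python
import numpy as np

def embed_watermark(image, watermark, alpha=0.05):
    """image: 2-D float array; watermark: 1-D array, len <= min(image.shape)."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    s_marked = s.copy()
    # Multiplicative perturbation of the leading singular values.
    s_marked[:watermark.size] += alpha * watermark * s[:watermark.size]
    return U @ np.diag(s_marked) @ Vt, s   # keep the original s for detection

def extract_watermark(marked, original_s, alpha=0.05, n=None):
    """Recover the watermark estimate from the marked image."""
    _, s, _ = np.linalg.svd(marked, full_matrices=False)
    n = n or original_s.size
    return (s[:n] - original_s[:n]) / (alpha * original_s[:n] + 1e-12)
```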
KEYWORDS: Time-frequency analysis, Denoising, Interference (communication), Signal to noise ratio, Wavelets, Smoothing, Probability theory, Image information entropy, Fourier transforms, Visualization
Signals used in time-frequency analysis are usually corrupted by noise; denoising the time-frequency representation is therefore a necessity for producing readable time-frequency images. Denoising is the operation of smoothing a noisy signal or image to produce a noise-free representation. Linear smoothing of time-frequency distributions (TFDs) suppresses noise at the expense of considerable smearing of the signal components. For this reason, nonlinear denoising has been preferred; a common example of a nonlinear denoising method is wavelet thresholding. In this paper, we introduce an entropy based approach to denoising time-frequency distributions. This new approach uses the spectrogram decomposition of time-frequency kernels proposed by Cunningham and Williams. In order to denoise the time-frequency distribution, we combine the spectrograms with the smallest entropy values, thus ensuring that each spectrogram is well concentrated on the time-frequency plane and contains as little noise as possible. Rényi entropy is used as the measure to quantify the complexity of each spectrogram. The threshold for the number of spectrograms to combine is chosen adaptively based on the tradeoff between entropy and variance. The denoised time-frequency distributions for several signals are shown to demonstrate the effectiveness of the method, and the improvement in performance is quantitatively evaluated.
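A minimal sketch of the selection rule: form spectrograms from a set of orthonormal analysis windows, rank them by Rényi entropy, and sum only the lowest-entropy (best-concentrated) ones. DPSS tapers stand in for the kernel eigen-windows of the Cunningham-Williams decomposition, and the number kept is fixed rather than chosen by the paper's entropy-variance tradeoff.

```python
import numpy as np
from scipy.signal import spectrogram
from scipy.signal.windows import dpss

def entropy_denoised_tfd(x, fs=1000, n_windows=8, n_keep=3, nperseg=128):
    """Sum of the n_keep lowest-entropy spectrograms over orthonormal tapers."""
    tapers = dpss(nperseg, NW=4, Kmax=n_windows)
    specs, entropies = [], []
    for w in tapers:
        _, _, S = spectrogram(x, fs=fs, window=w, nperseg=nperseg)
        P = S / S.sum()
        specs.append(S)
        entropies.append(np.log2(np.sum(P ** 3)) / (1 - 3))  # Rényi, alpha = 3
    keep = np.argsort(entropies)[:n_keep]
    return sum(specs[i] for i in keep)
```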
KEYWORDS: Time-frequency analysis, Wavelets, Fourier transforms, Electrical engineering, Computer science, Signal analyzers, Signal processing, Space operations, Matrices, Lead
Previous work has shown that time-frequency distributions (TFDs) belonging to Cohen's class can be represented as a sum of weighted spectrograms. This representation offers the means of reducing the computational complexity of TFDs. The windows in the spectrogram representation may either be the eigenfunctions obtained from an eigendecomposition of the kernel or any complete set of orthonormal basis functions. The efficiency of the computation can be increased further by using a set of scaled and shifted functions, such as wavelets. In this paper, the concept of scaling is considered in the discrete-time domain. The scale operator in the frequency domain is formulated, and the vectors corresponding to the solutions of this eigenvalue problem in discrete time are derived. These new eigenvectors are very similar in structure to the eigenvectors obtained from the eigendecomposition of reduced interference distribution (RID) kernels. The relationship between these two sets of window functions is illustrated, and a new, efficient way of decomposing time-frequency kernels is introduced. The results are compared to previous decomposition methods. Finally, some possible applications of these discrete scale functions in obtaining new time-frequency distributions are discussed.
This paper outlines means of using special sets of orthonormally related windows to realize Cohen's class of time-frequency distributions (TFDs). This is accomplished by decomposing the kernel of the distribution in terms of the set of analysis windows to obtain short-time Fourier transforms (STFTs). The STFTs obtained using these analysis windows are used to form spectrograms, which are then linearly combined with proper weights to form the desired TFD. A set of orthogonal analysis windows that also have the scaling property proves to be very effective, requiring only 1 + log2(N - 1) distinct windows for an overall analysis of N + 1 points, where N = 2^n with n a positive integer. Application of this theory offers very fast computation of TFDs, since very few analysis windows are needed and fast, recursive STFT algorithms can be used. Additionally, it is shown that a minimal set of specially derived orthonormal windows can represent most TFDs, including Reduced Interference Distributions (RIDs), with only three distinct windows plus an impulse window. Finally, the Minimal Window RID (MW-RID), which achieves RID properties with only one distinct window and an impulse window, is presented.
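A minimal sketch of the decomposition idea common to the two papers above: approximate a Cohen's-class TFD as a weighted sum of spectrograms whose windows are eigenvectors of the kernel matrix. The toy "kernel" here is an arbitrary symmetric matrix supplied by the caller, not an actual RID kernel, and the truncation to a few dominant terms is the source of the computational savings.

```python
import numpy as np
from scipy.signal import spectrogram

def tfd_from_kernel(x, kernel, fs=1000, n_terms=4):
    """kernel: symmetric (nperseg x nperseg) matrix in the lag domain."""
    eigvals, eigvecs = np.linalg.eigh(kernel)
    order = np.argsort(np.abs(eigvals))[::-1][:n_terms]  # dominant terms first
    tfd = 0.0
    for i in order:
        # Each eigenvector serves as an analysis window for one spectrogram.
        _, _, S = spectrogram(x, fs=fs, window=eigvecs[:, i],
                              nperseg=kernel.shape[0])
        tfd = tfd + eigvals[i] * S  # weight each spectrogram by its eigenvalue
    return tfd
```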