UM E-Theses Collection (澳門大學電子學位論文庫)

Title

Atomic representation for subspace clustering and pattern classification

English Abstract

High-dimensional (HD) data are widespread in the real world, such as images with millions of pixels and videos with millions of frames. Recent research suggests that, instead of being uniformly distributed in the ambient space, high-dimensional data often have low-dimensional (LD) structures. For example, facial images of a subject under varying illumination approximately lie in a low-dimensional subspace, and feature trajectories of a rigidly moving object in a video can be well characterized by a low-dimensional subspace. Learning the LD structures helps not only to improve the performance but also to reduce the computational cost of many learning algorithms of interest. In this research, we consider a general unified model called atomic representation (AR) for HD data analysis, which generalizes the well-known sparse representation approach.

The first part develops a general atomic representation based framework for subspace clustering, referred to as ARSC for short. Specifically, this work addresses the problem of clustering data points that are approximately drawn from multiple subspaces. The proposed framework includes many popular subspace clustering methods as special cases. Most existing subspace clustering methods utilize the Mean Square Error (MSE) criterion as the loss function. Since MSE relies heavily on the Gaussianity assumption, previous MSE based subspace clustering methods are sensitive to non-Gaussian noise. To alleviate this limitation, we use ARSC as a general platform and develop a novel subspace clustering method, termed MEESSC (Minimum Error Entropy based Sparse Subspace Clustering), by specifying the Minimum Error Entropy (MEE) criterion as the loss function and the sparsity-inducing atomic set. Extensive experiments on both synthetic and real data show that MEESSC overcomes this limitation.

The second part focuses on a structural atomic representation based classifier framework with applications to handwritten digit and face recognition. Recently, a large family of representation based classification methods has been proposed and has attracted great interest in pattern recognition and computer vision. We present a general framework, termed the atomic representation based classifier (ARC), to systematically unify many representation based classification methods. Despite their decent performance, most representation based classification methods treat test samples separately and fail to consider the correlation between them. For this reason, we develop a structural atomic representation based classifier (SARC) based on Bayesian analysis with a Markov random field based multilevel logistic prior. The proposed SARC can exploit the structural information among the test data to further improve the performance of every representation based classifier in the ARC framework. Experimental results on both synthetic and real databases demonstrate the effectiveness of the proposed framework.

The third part is devoted to robust face recognition and develops a minimum error entropy based atomic representation (MEEAR) framework. Unlike existing MSE based representation-based classifiers (RCs), our framework is built on the minimum error entropy criterion, which is shown to be much more robust to noise. As a general framework, MEEAR also serves as a platform for developing two new pattern classification methods. Experimental results on popular face databases show the efficacy of MEEAR compared with state-of-the-art MSE based RCs.
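To make the MSE-versus-MEE contrast concrete: a common way to realize the MEE criterion is through the Parzen-window information potential of the error, which large outliers barely affect. The sketch below is only an illustration under assumed settings (the kernel bandwidth, toy data, and function names are not from the thesis); it shows how an MEE-style loss stays nearly flat under impulsive, non-Gaussian corruption that inflates MSE by orders of magnitude.

```python
import numpy as np

def mse(e):
    """Mean square error: grows quadratically with outliers."""
    return np.mean(e ** 2)

def mee_loss(e, sigma=1.0):
    """MEE-style loss via the Parzen-window information potential
    V(e) = (1/N^2) * sum_ij G_sigma(e_i - e_j). Minimizing Renyi's
    quadratic error entropy amounts to maximizing V(e), so -V(e) is
    returned as a loss. Outliers fall outside the kernel width and
    contribute almost nothing, which is the source of robustness.
    (sigma = 1.0 is an illustrative choice, not from the thesis.)"""
    diffs = e[:, None] - e[None, :]
    return -np.mean(np.exp(-diffs ** 2 / (2.0 * sigma ** 2)))

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 0.1, size=200)   # Gaussian residuals
dirty = clean.copy()
dirty[:10] += 20.0                        # 5% impulsive, non-Gaussian corruption

for name, e in [("clean", clean), ("dirty", dirty)]:
    print(f"{name}: MSE={mse(e):.3f}  MEE loss={mee_loss(e):.4f}")
# MSE jumps from about 0.01 to about 20 under corruption,
# while the MEE-style loss changes only slightly.
```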
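The abstract describes ARC as unifying many representation based classification methods; the classical sparse representation based classification (SRC) rule is one widely known member of that family. The following is a minimal sketch of that rule, not the thesis's implementation: the alpha value, toy data, and helper name are illustrative assumptions. A test sample is sparsely coded over the training dictionary and assigned to the class whose atoms reconstruct it with the smallest residual.

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(x, D, labels, alpha=0.01):
    """Classify x by class-wise reconstruction residual (SRC rule).
    D: (d, n) dictionary whose columns are training samples,
    labels: (n,) class label of each column, x: (d,) test sample."""
    # Sparse coding step: min_a ||x - D a||^2 + alpha * ||a||_1
    coder = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    a = coder.fit(D, x).coef_
    best_class, best_res = None, np.inf
    for c in np.unique(labels):
        a_c = np.where(labels == c, a, 0.0)   # keep class-c coefficients only
        res = np.linalg.norm(x - D @ a_c)     # class-wise reconstruction error
        if res < best_res:
            best_class, best_res = c, res
    return best_class

# Toy usage: two classes, each spanning its own random 3-D subspace.
rng = np.random.default_rng(1)
bases = [rng.normal(size=(50, 3)) for _ in range(2)]
D = np.hstack([B @ rng.normal(size=(3, 20)) for B in bases])  # 40 atoms
D /= np.linalg.norm(D, axis=0)                                # unit-norm atoms
labels = np.repeat([0, 1], 20)
x = bases[1] @ rng.normal(size=3)        # test sample from class 1's subspace
print(src_classify(x, D, labels))        # should print 1 for this toy setup
```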

Issue date

2016.

Author

Wang, Yu Long

Faculty

Faculty of Science and Technology

Department

Department of Computer and Information Science
Degree

Ph.D.

Subject

Image processing -- Digital techniques

Pattern recognition systems

Supervisor

Tang, Yuan Yan

Files In This Item

Full-text (Intranet only)

Location

1/F Zone C

Library URL

991005817359706306