Similarity preserving low-rank representation for enhanced data representation and effective subspace learning
Institution: 1. School of Computer Science and Technology, Soochow University, Suzhou 215006, PR China; 2. Department of Electrical and Computer Engineering, National University of Singapore, Singapore; 3. Department of Electronic Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong
Abstract: Latent Low-Rank Representation (LatLRR) delivers robust and promising results for subspace recovery and feature extraction by recovering the so-called hidden effects, but its optimization cannot preserve the locality of similar principal and salient features. To address this issue and achieve enhanced performance, a boosted version of LatLRR, referred to as regularized Low-Rank Representation (rLRR), is proposed by explicitly incorporating an appropriate Laplacian regularization that maximally preserves the similarity among local features. Like LatLRR, rLRR decomposes a given data matrix from two directions by seeking a pair of low-rank matrices, but rLRR effectively preserves the similarities of both principal and salient features. As a result, correlated features are well grouped and the robustness of the representations is enhanced. Based on the bi-directional low-rank codes produced by rLRR, an unsupervised subspace learning framework termed Low-rank Similarity Preserving Projections (LSPP) is also derived for feature learning. A supervised extension of LSPP for discriminant subspace learning is also discussed. The validity of rLRR is examined through robust representation and decomposition of real images. The results demonstrate the superiority of rLRR and LSPP over other related state-of-the-art algorithms.
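To make the two ingredients of the abstract concrete, the sketch below shows the standard building blocks that solvers for Laplacian-regularized low-rank representation typically combine: singular value thresholding (the proximal operator of the nuclear norm, used to enforce low rank) and a k-NN graph Laplacian whose trace penalty tr(Z L Zᵀ) encodes similarity preservation. This is a minimal illustration of the general technique, not the authors' exact rLRR algorithm; the function names, the neighborhood size, and the Gaussian affinity weighting are assumptions for the example.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: soft-threshold the singular values
    # of M by tau. This is the proximal operator of the nuclear norm,
    # the standard low-rank-enforcing step in LRR-style solvers.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def graph_laplacian(X, k=3):
    # Unnormalized Laplacian L = D - W of a k-NN similarity graph over
    # the columns (samples) of X. A similarity-preserving regularizer
    # is then a trace term of the form tr(Z L Z^T).
    n = X.shape[1]
    d2 = np.sum((X[:, :, None] - X[:, None, :]) ** 2, axis=0)  # pairwise sq. dists
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # k nearest neighbours, skipping self
        W[i, idx] = np.exp(-d2[i, idx] / (d2[i, idx].mean() + 1e-12))
    W = np.maximum(W, W.T)  # symmetrize the affinity matrix
    return np.diag(W.sum(axis=1)) - W

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))        # toy data: 5 features, 20 samples
Z = svt(X.T @ X, tau=1.0)               # low-rank coefficient estimate
L = graph_laplacian(X)
laplacian_penalty = np.trace(Z @ L @ Z.T)  # similarity-preserving term (>= 0)
```

An iterative solver would alternate such thresholding steps with gradient or multiplier updates on the Laplacian term; since L is positive semidefinite, the penalty is always nonnegative and is minimized when similar samples receive similar coefficient rows.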
Keywords: Low-rank representation; Subspace recovery; Similarity preservation; Laplacian regularization; Enhanced representation; Feature learning
This article has been indexed in ScienceDirect and other databases.