Fast low-rank shared dictionary learning
for object classification

LRSDL


ABSTRACT

Although different objects possess distinct class-specific features, they usually also share common patterns. This observation has been partially exploited in a recently proposed dictionary learning framework that separates the particularity and the commonality (COPAR). Inspired by this, we propose a novel method that explicitly and simultaneously learns a set of common patterns as well as class-specific features for classification, with more intuitive constraints. Our dictionary learning framework is therefore characterized by both a shared dictionary and particular (class-specific) dictionaries. For the shared dictionary, we enforce a low-rank constraint, i.e., we require that its spanning subspace have low dimension and that the coefficients corresponding to this dictionary be similar. For the particular dictionaries, we impose the well-known constraints stated in Fisher discrimination dictionary learning (FDDL). Furthermore, we propose new fast and accurate algorithms to solve the subproblems in the learning step, accelerating its convergence. These algorithms can also be applied to FDDL and its extensions. Their efficiency is theoretically and experimentally verified by comparing their complexity and running time with those of other well-known dictionary learning methods. Experimental results on widely used image datasets establish the advantages of our method over state-of-the-art dictionary learning methods.
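The low-rank constraint on the shared dictionary is typically handled with singular value thresholding (SVT), the proximal operator of the nuclear norm (see reference 7 below). As a rough illustration only, not the paper's actual update rule, an SVT step can be sketched in a few lines of NumPy:

```python
import numpy as np

def svt(D, tau):
    """Singular value thresholding: prox of tau * nuclear norm.

    Shrinks each singular value of D toward zero by tau, which drives
    small singular values to exactly zero and hence lowers the rank of D.
    """
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold the singular values
    return U @ np.diag(s_shrunk) @ Vt

# Illustration on a random matrix (tau chosen arbitrarily):
rng = np.random.default_rng(0)
D = rng.standard_normal((50, 20))   # stand-in for a shared dictionary D0
D_low = svt(D, tau=5.0)             # low-rank approximation of D
```

Within an alternating dictionary-learning scheme, a step like this would be applied to the shared dictionary after each gradient (or block-coordinate) update; the particular dictionaries are updated under the FDDL-style constraints instead.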

DICTOL - Dictionary learning toolbox

An implementation of LRSDL and other well-known dictionary learning methods can be found in [this Github repository].

Another version of this toolbox can be found in [this Github repository].

Motivation

Idea Visualization & Cost Function

Convergence Rate & Computational Complexity

Samples from Widely-used Datasets

Classification Results

Related Publications

  1. Tiep H. Vu, Vishal Monga. "Fast Low-rank Shared Dictionary Learning for Image Classification." IEEE Transactions on Image Processing, volume 26, issue 11, pages 5160-5175, November 2017 [paper].

  2. Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." IEEE International Conference on Image Processing (ICIP), pp. 4428-4432, 2016. [paper], [ICIP poster].

Selected References

  1. Wright, John, et al. "Robust face recognition via sparse representation." IEEE Transactions on Pattern Analysis and Machine Intelligence 31.2 (2009): 210-227. [paper].

  2. Mairal, Julien, et al. "Online learning for matrix factorization and sparse coding." The Journal of Machine Learning Research 11 (2010): 19-60. [paper].

  3. Jiang, Zhuolin, Zhe Lin, and Larry S. Davis. "Label consistent K-SVD: Learning a discriminative dictionary for recognition." IEEE Transactions on Pattern Analysis and Machine Intelligence, 35.11 (2013): 2651-2664. [project page].

  4. Yang, Meng, et al. "Fisher discrimination dictionary learning for sparse representation." IEEE International Conference on Computer Vision (ICCV), 2011. [paper].

  5. Ramirez, Ignacio, Pablo Sprechmann, and Guillermo Sapiro. "Classification and clustering via dictionary learning with structured incoherence and shared features." IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2010. [paper].

  6. Kong, Shu, and Donghui Wang. "A dictionary learning approach for classification: separating the particularity and the commonality." European Conference on Computer Vision. Springer Berlin Heidelberg, 2012. 186-199. [paper].

  7. Cai, Jian-Feng, Emmanuel J. Candès, and Zuowei Shen. "A singular value thresholding algorithm for matrix completion." SIAM Journal on Optimization 20.4 (2010): 1956-1982. [paper].

  8. Beck, Amir, and Marc Teboulle. "A fast iterative shrinkage-thresholding algorithm for linear inverse problems." SIAM Journal on Imaging Sciences 2.1 (2009): 183-202. [paper].

  9. SPAMS: The Sparse Modeling Software [project page].

Email
ipal.psu@gmail.com

Address
104 Electrical Engineering East,
University Park, PA 16802, USA

Lab Phone:
814-863-7810
814-867-4564