Sparse representation with global and nonlocal self-similarity prior for single image super-resolution

Abstract

Sparse representation models with a nonlocal self-similarity prior exhibit good performance in single image super-resolution (SR). However, because each image patch is coded independently, the global similarity information shared by all similar patches across the whole image is lost; as a result, similar image patches may be encoded with entirely different coding coefficients. In this paper, considering that a low-rank constraint is better at capturing this global similarity information, a new sparse representation model that combines a global low-rank prior with the nonlocal self-similarity prior is proposed for single image super-resolution. The weighted nuclear norm minimization (WNNM) method is then introduced to solve the proposed model effectively. Extensive experimental results validate that the presented model achieves convincing improvements over many state-of-the-art SR models, both quantitatively and perceptually.
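For readers unfamiliar with WNNM, the sketch below illustrates the core operation it relies on: stacking similar patches as the columns of a matrix and applying weighted singular value soft-thresholding, where smaller (noisier) singular values receive larger weights and are shrunk more. This is a minimal illustration of the general WNNM step, not the authors' exact formulation; the constant `c`, the noise estimate, and the weight heuristic are assumptions chosen for clarity.

```python
import numpy as np

def wnnm_shrink(patch_matrix, noise_sigma, c=2.8, eps=1e-8):
    """Weighted nuclear norm minimization step on a matrix whose columns
    are vectorized similar patches (a sketch, with assumed parameters).

    Approximately solves  min_X ||Y - X||_F^2 + ||X||_{w,*}  via weighted
    singular value soft-thresholding; the closed-form solution applies
    because the weights are non-descending in the singular values.
    """
    U, s, Vt = np.linalg.svd(patch_matrix, full_matrices=False)
    n_patches = patch_matrix.shape[1]
    # Rough estimate of the clean matrix's singular values (assumed heuristic).
    s_clean = np.sqrt(np.maximum(s**2 - n_patches * noise_sigma**2, 0.0))
    # Larger weights shrink smaller singular values more aggressively.
    weights = c * np.sqrt(n_patches) * noise_sigma**2 / (s_clean + eps)
    s_shrunk = np.maximum(s - weights, 0.0)
    return (U * s_shrunk) @ Vt
```

In a full SR pipeline, such a step would be applied to each group of nonlocally similar patches and interleaved with the sparse coding and reconstruction updates, so that the low-rank prior enforces consistent coefficients across similar patches.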

Citation (APA)

Gong, W., Chen, X., Li, J., Tang, Y., & Li, W. (2017). Sparse representation with global and nonlocal self-similarity prior for single image super-resolution. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10262 LNCS, pp. 222–230). Springer Verlag. https://doi.org/10.1007/978-3-319-59081-3_27
