After settling on a research direction I have been catching up on theory. I recently read a number of papers, got some ideas, and along the way summarized the series of papers on representation. Since I am new to this area there may be shortcomings, and I welcome corrections and discussion.
The series of posts from sparse representation to low-rank representation covers the following topics:
II. NCSR (Nonlocally Centralized Sparse Representation)
III. GHP (Gradient Histogram Preservation)
V. Rank decomposition
This part is a continuation of the previous post and introduces low-rank decomposition.
Group sparsity, after all, only constrains the nonzero entries within each group, which is a one-dimensional notion. Rank, by contrast, describes the correlation structure of a matrix, in which every atom exhibits two-dimensional order through its position and arrangement, so rank decomposition analyzes the structure of similarity from a 2D perspective.
The matrix rank problem can be converted into a singular value decomposition problem (see any matrix theory reference); for this purpose, the nuclear norm was proposed:
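To make this concrete, the nuclear norm is the sum of the singular values, and NNM recovers a low-rank matrix $X$ from a noisy observation $Y$ (written here in one common form, not necessarily the exact notation of the papers below):

$$\|X\|_* = \sum_i \sigma_i(X), \qquad \hat{X} = \arg\min_X \tfrac{1}{2}\|Y-X\|_F^2 + \lambda\|X\|_*$$

The nuclear norm is the convex envelope of the rank, and the problem above has a closed-form solution by soft-thresholding the singular values: if $Y = U\Sigma V^T$, then $\hat{X} = U\,\mathcal{S}_\lambda(\Sigma)\,V^T$ with $\mathcal{S}_\lambda(\Sigma)_{ii} = \max(\sigma_i - \lambda,\ 0)$.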
Nuclear Norm Minimization (NNM)
NNM: pros and cons
The number of nonzero singular values equals the rank, but the large singular values are always only a small fraction of them: the few largest singular values account for roughly 99% of the total singular value mass. Since NNM shrinks all singular values by the same threshold, it over-penalizes exactly these large, informative components, which led to treating the singular values with different weights:
Weighted nuclear norm minimization (WNNM)
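In the same notation as above, WNNM simply assigns a nonnegative weight $w_i$ to each singular value, so the regularizer can penalize different components differently:

$$\|X\|_{w,*} = \sum_i w_i\,\sigma_i(X), \qquad \hat{X} = \arg\min_X \tfrac{1}{2}\|Y-X\|_F^2 + \|X\|_{w,*}$$

With general weights this problem is no longer convex, which is why its optimization needs separate analysis.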
Optimization of WNNM
【1】 Q. Xie, D. Meng, S. Gu, L. Zhang, W. Zuo, X. Feng, and Z. Xu, "On the optimization of weighted nuclear norm minimization," Technical Report, to be online soon.
An important corollary
【2】 Q. Xie, D. Meng, S. Gu, L. Zhang, W. Zuo, X. Feng, and Z. Xu, "On the optimization of weighted nuclear norm minimization," Technical Report, to be online soon.
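As I understand the result in 【1】【2】, the key corollary is that when the weights are non-descending, $0 \le w_1 \le w_2 \le \dots \le w_n$ (so the larger singular values, listed first, receive the smaller weights), the non-convex WNNM problem above still has a global optimum in closed form, obtained by weighted soft-thresholding of the singular values of $Y = U\Sigma V^T$:

$$\hat{X} = U\,\mathcal{S}_{w}(\Sigma)\,V^T, \qquad \mathcal{S}_{w}(\Sigma)_{ii} = \max(\sigma_i - w_i,\ 0)$$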
Application of WNNM to image denoising
1. For each noisy patch, search in the image for its nonlocal similar patches to form the matrix Y.
2. Solve the WNNM problem to estimate the clean patches X from Y.
3. Put the clean patches back into the image.
4. Repeat the above procedure several times to obtain the denoised image (a minimal Python sketch of this loop is given below).
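Below is a minimal, illustrative Python/NumPy sketch of the four steps. The function names (wnnm_group, wnnm_denoise), the patch size, stride, search window, number of similar patches, the constant C, and the residual-feedback factor delta are all assumptions chosen for readability, not the settings of 【3】; the weight rule follows the inverse-singular-value idea discussed under "About the weights" below.

```python
import numpy as np

def wnnm_group(Y, sigma_n, C=2.8, eps=1e-8):
    """Weighted singular value shrinkage on one group of similar patches.

    Y has one vectorized patch per column. The weight rule below is an
    assumption in the spirit of [3]; see "About the weights".
    """
    n = Y.shape[1]
    mu = Y.mean(axis=1, keepdims=True)                    # remove the group mean
    U, s, Vt = np.linalg.svd(Y - mu, full_matrices=False)
    # crude estimate of the clean singular values from the noisy ones
    s_clean = np.sqrt(np.maximum(s ** 2 - n * sigma_n ** 2, 0.0))
    w = C * np.sqrt(n) * sigma_n ** 2 / (s_clean + eps)   # non-descending weights
    s_hat = np.maximum(s - w, 0.0)                        # weighted soft-thresholding
    return (U * s_hat) @ Vt + mu

def wnnm_denoise(noisy, sigma_n, patch=8, stride=4, n_similar=60,
                 search=20, n_iter=3, delta=0.1):
    """Sketch of the four-step loop: block matching -> WNNM -> aggregation -> repeat."""
    img = noisy.astype(np.float64).copy()
    H, W = img.shape
    for _ in range(n_iter):
        img = img + delta * (noisy - img)                 # feed back part of the residual
        acc = np.zeros_like(img)
        cnt = np.zeros_like(img)
        for i in range(0, H - patch + 1, stride):
            for j in range(0, W - patch + 1, stride):
                ref = img[i:i + patch, j:j + patch]
                # step 1: collect the most similar patches in a local window
                coords, dists = [], []
                for y in range(max(0, i - search), min(H - patch, i + search) + 1, 2):
                    for x in range(max(0, j - search), min(W - patch, j + search) + 1, 2):
                        coords.append((y, x))
                        dists.append(np.sum((img[y:y + patch, x:x + patch] - ref) ** 2))
                best = [coords[k] for k in np.argsort(dists)[:n_similar]]
                Y = np.stack([img[y:y + patch, x:x + patch].ravel() for y, x in best], axis=1)
                # step 2: estimate the clean patch group with WNNM
                X = wnnm_group(Y, sigma_n)
                # step 3: put the cleaned patches back and average overlaps
                for k, (y, x) in enumerate(best):
                    acc[y:y + patch, x:x + patch] += X[:, k].reshape(patch, patch)
                    cnt[y:y + patch, x:x + patch] += 1.0
        # step 4: the outer loop repeats this procedure several times
        img = np.where(cnt > 0, acc / np.maximum(cnt, 1.0), img)
    return img
```

Calling wnnm_denoise(noisy_image, sigma_n=25.0) on a grayscale image in [0, 255] would run the whole pipeline, though this naive block-matching loop is of course far slower than the optimized implementation used in 【3】.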
WNNM-based image denoising
【3】 S. Gu, L. Zhang, W. Zuo, and X. Feng, "Weighted Nuclear Norm Minimization with Application to Image Denoising," CVPR 2014.
About the weights
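As far as I recall from 【3】, the weights are set inversely proportional to the (estimated) singular values of the clean matrix, so that the larger singular values, which carry the major components of the latent data, are shrunk less:

$$w_i = \frac{C\sqrt{n}}{\hat{\sigma}_i(X) + \varepsilon}, \qquad \hat{\sigma}_i(X) = \sqrt{\max\!\big(\sigma_i^2(Y) - n\sigma_n^2,\ 0\big)}$$

where $n$ is the number of similar patches in the group, $\sigma_n$ is the noise standard deviation, $C$ is a constant, and $\varepsilon$ avoids division by zero. Ordered this way the weights are automatically non-descending, so the closed-form solution above applies.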
Results: the denoising performance is quite good.
Thinking: What representation is better for image interpretation?
What’s next?
• Actually I don’t know … Probably “Sparse/Low-rank + Big Data”?
  – Theoretical analysis?
  – Algorithms and implementation?
• W.r.t. image restoration, one interesting topic (at least I think) is perceptual quality oriented image restoration.
To be continued. For more, please follow http://blog.csdn.net/tiandijun. Comments and discussion are welcome!