• Spectral Clustering Based on Weighted KPCA


    Weighted Kernel PCA

    The KPCA algorithm is the foundation here; for a quick overview, see my earlier blog post.

    To make KPCA more robust and sparse, weights can be introduced so that noisy points are down-weighted. Starting from the original formulation, introduce a symmetric positive-definite weight matrix $V$:

    $$\begin{aligned} \max_{w,e}\; & J_p(w,e)=\gamma\frac{1}{2}e^{T}Ve-\frac{1}{2}w^{T}w \\ \text{s.t.}\; & e=\Phi w \\ & \Phi=\begin{bmatrix}\phi(x_1)^{T};\,\dots;\,\phi(x_N)^{T}\end{bmatrix} \\ & V=V^{T}>0 \end{aligned}$$

    Solving via the Lagrangian:

    $$L(w,e;\alpha)=\gamma\frac{1}{2}e^{T}Ve-\frac{1}{2}w^{T}w-\alpha^{T}(e-\Phi w)$$

    At the optimum,

    $$\frac{\partial L}{\partial w}=0 \;\rightarrow\; w=\Phi^{T}\alpha$$

    $$\frac{\partial L}{\partial e}=0 \;\rightarrow\; \alpha=\gamma Ve$$

    $$\frac{\partial L}{\partial \alpha}=0 \;\rightarrow\; e=\Phi w$$

    Eliminating $w$ and $e$ yields an eigenvalue problem for a nonsymmetric matrix, where $\Omega=\Phi\Phi^{T}$ is the kernel matrix ($\Omega_{ij}=\kappa(x_i,x_j)$) and $\lambda=1/\gamma$:

    $$V\Omega\alpha=\lambda\alpha$$

    $V\Omega$ need not be symmetric, but since $V$ is positive definite and $\Omega$ is positive (semi)definite, the eigenvalues of $V\Omega$ are still real and nonnegative: $V\Omega$ is similar to the symmetric matrix $V^{1/2}\Omega V^{1/2}$.

    The projection of a test point $x$ is:

    $$z(x)=w^{T}\phi(x)=\sum_{l=1}^{N}\alpha_{l}\,\kappa(x_{l},x)$$
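    The derivation above can be checked numerically. Below is a minimal NumPy sketch (not from the paper; the RBF kernel choice, function names, and parameters are my own illustrative assumptions): it builds the kernel matrix $\Omega$, solves $V\Omega\alpha=\lambda\alpha$, and projects points with $z(x)=\sum_l \alpha_l\,\kappa(x_l,x)$.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2)), computed pairwise.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def weighted_kpca(X, V, sigma=1.0, k=2):
    # Solve V Omega alpha = lambda alpha for the leading k eigenpairs.
    Omega = rbf_kernel(X, X, sigma)          # Omega_ij = kappa(x_i, x_j)
    lam, A = np.linalg.eig(V @ Omega)        # V Omega is nonsymmetric in general
    order = np.argsort(-lam.real)            # eigenvalues are real and nonnegative
    return lam.real[order[:k]], A.real[:, order[:k]], Omega

def project(alpha, X_train, X_new, sigma=1.0):
    # z(x) = sum_l alpha_l kappa(x_l, x): works for out-of-sample points too.
    return rbf_kernel(X_new, X_train, sigma) @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
V = np.eye(20)                               # V = I_N recovers plain KPCA
lam, A, Omega = weighted_kpca(X, V)
z = project(A[:, 0], X, X)                   # scores of the training points
print(np.allclose(V @ Omega @ A[:, 0], lam[0] * A[:, 0]))
```

    Setting $V$ to something other than the identity (e.g. down-weighting suspected noise points) changes only the weight matrix, not the rest of the pipeline.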

    Connection to Spectral Clustering

    Different choices of the weight matrix $V$ recover several well-known spectral clustering formulations:

    kernel alignment

    $$\Omega\bar q=\lambda\bar q$$

    Markov random walks

    $$D^{-1}Wr=\lambda r$$

    normalized cut

    $$L\bar q=\lambda D\bar q$$

    NJW

    $$(D^{-1}WD^{-\frac{1}{2}})\bar q=\lambda D^{-\frac{1}{2}}\bar q$$

    | Method | Original problem | $V$ | Relaxed solution |
    | --- | --- | --- | --- |
    | Alignment | $$\Omega q=\lambda q$$ | $$I_N$$ | $$\alpha^{(1)}$$ |
    | Ncut | $$Lq=\lambda Dq$$ | $$D^{-1}$$ | $$\alpha^{(2)}$$ |
    | Random walks | $$D^{-1}Wq=\lambda q$$ | $$D^{-1}$$ | $$\alpha^{(2)}$$ |
    | NJW | $$(D^{-1}WD^{-\frac{1}{2}})\bar q=\lambda D^{-\frac{1}{2}}\bar q$$ | $$D^{-1}$$ | $$D^{\frac{1}{2}}\alpha^{(2)}$$ |
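    The NJW row of the table can be verified numerically. A small sketch (my own illustration, with the kernel matrix $\Omega$ standing in for the affinity matrix $W$): take $V=D^{-1}$, solve $V\Omega\alpha=\lambda\alpha$, and check that $D^{1/2}\alpha$ is an eigenvector of the NJW matrix $D^{-1/2}\Omega D^{-1/2}$ with the same eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(15, 2))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Omega = np.exp(-d2 / 2.0)                    # kernel matrix, playing the role of W
deg = Omega.sum(axis=1)                      # degrees: row sums of Omega
D = np.diag(deg)

# Weighted KPCA with V = D^{-1}: solve V Omega alpha = lambda alpha.
V = np.linalg.inv(D)
lam, A = np.linalg.eig(V @ Omega)
i = np.argmax(lam.real)
alpha, lam0 = A.real[:, i], lam.real[i]      # top eigenpair (lambda = 1 here,
                                             # since D^{-1} Omega is row-stochastic)

# NJW relation: D^{-1/2} Omega D^{-1/2} (D^{1/2} alpha) = lambda (D^{1/2} alpha).
Dh_inv = np.diag(1.0 / np.sqrt(deg))
q = np.diag(np.sqrt(deg)) @ alpha            # relaxed solution D^{1/2} alpha^{(2)}
lhs = Dh_inv @ Omega @ Dh_inv @ q
print(np.allclose(lhs, lam0 * q))
```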

    Derivation with a Bias Term

    $$\begin{aligned} \max_{w,e}\; & J_p(w,e)=\gamma\frac{1}{2}e^{T}Ve-\frac{1}{2}w^{T}w \\ \text{s.t.}\; & e=\Phi w+b1_N \\ & \Phi=\begin{bmatrix}\phi(x_1)^{T};\,\dots;\,\phi(x_N)^{T}\end{bmatrix} \\ & V=V^{T}>0 \end{aligned}$$

    Again solving via the Lagrangian:

    $$L(w,e;\alpha)=\gamma\frac{1}{2}e^{T}Ve-\frac{1}{2}w^{T}w-\alpha^{T}(e-\Phi w-b1_N)$$

    At the optimum,

    $$\begin{aligned} & \frac{\partial L}{\partial w}=0 \;\rightarrow\; w=\Phi^{T}\alpha \\ & \frac{\partial L}{\partial e}=0 \;\rightarrow\; \alpha=\gamma Ve \\ & \frac{\partial L}{\partial b}=0 \;\rightarrow\; 1_N^{T}\alpha=0 \\ & \frac{\partial L}{\partial \alpha}=0 \;\rightarrow\; e=\Phi w+b1_N \end{aligned}$$

    Solving for the bias gives:

    $$b=-\frac{1}{1_N^{T}V1_N}\,1_N^{T}V\Omega\alpha$$

    Eliminating $w$ and $e$ yields the eigenvalue problem:

    $$M\Omega\alpha=\lambda\alpha$$

    where $M$ is the weighted centering matrix $$M=V-\frac{1}{1_N^{T}V1_N}\,V1_N1_N^{T}V$$
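    The matrix $M$ can be formed directly, and a quick numeric check confirms the key consequence of the bias term: since $1_N^{T}M=0$, any eigenvector of $M\Omega$ with a nonzero eigenvalue automatically satisfies the optimality condition $1_N^{T}\alpha=0$. (Illustrative code; the diagonal weight matrix below is an arbitrary choice of mine, not from the paper.)

```python
import numpy as np

rng = np.random.default_rng(2)
N = 12
X = rng.normal(size=(N, 2))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Omega = np.exp(-d2 / 2.0)                    # kernel matrix
V = np.diag(rng.uniform(0.5, 1.5, N))        # an arbitrary diagonal SPD weight matrix
ones = np.ones(N)

# Weighted centering matrix M = V - V 1 1^T V / (1^T V 1); note 1^T M = 0.
M = V - np.outer(V @ ones, V @ ones) / (ones @ V @ ones)

lam, A = np.linalg.eig(M @ Omega)
alpha = A.real[:, np.argmax(lam.real)]       # eigenvector for the largest eigenvalue

# Bias from the derivation, and the optimality condition 1^T alpha = 0.
b = -(ones @ V @ Omega @ alpha) / (ones @ V @ ones)
print(np.allclose(ones @ alpha, 0.0))
```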

    The projection of a test point $x$ is:

    $$z(x)=w^{T}\phi(x)+b=\sum_{l=1}^{N}\alpha_{l}\,\kappa(x_{l},x)+b$$

    The cluster membership of a test point can then be estimated by thresholding the score at $\theta$ (see my blog post if anything is unclear):

    $$q(x)=\mathrm{sign}\left(w^{T}\phi(x)-\theta\right)=\mathrm{sign}\left(\sum_{l=1}^{N}\alpha_{l}\,\kappa(x_{l},x)-\theta\right)$$
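    Putting the pieces together for a two-cluster toy problem (an end-to-end sketch with made-up data, $V=D^{-1}$, $\theta=0$, and the bias $b$ from the derivation above; this is my own illustration, not the paper's experimental setup): on two well-separated blobs, the sign of the score should split the points by cluster.

```python
import numpy as np

rng = np.random.default_rng(3)
# Two well-separated Gaussian blobs; the sign of the score should split them.
X = np.vstack([rng.normal(-3, 0.5, (10, 2)), rng.normal(3, 0.5, (10, 2))])
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Omega = np.exp(-d2 / (2 * 2.0**2))           # RBF kernel, sigma = 2 (illustrative)

D = np.diag(Omega.sum(axis=1))               # degree matrix
V = np.linalg.inv(D)                         # random-walk weighting V = D^{-1}
ones = np.ones(len(X))
M = V - np.outer(V @ ones, V @ ones) / (ones @ V @ ones)

lam, A = np.linalg.eig(M @ Omega)            # M Omega alpha = lambda alpha
alpha = A.real[:, np.argmax(lam.real)]
b = -(ones @ V @ Omega @ alpha) / (ones @ V @ ones)

# q(x) = sign(sum_l alpha_l kappa(x_l, x) + b - theta), with theta = 0:
q = np.sign(Omega @ alpha + b)
split = (q[:10] == q[0]).all() and (q[10:] == q[10]).all() and q[0] != q[10]
print(bool(split))
```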


• Original post: https://www.cnblogs.com/hainingwyx/p/6845382.html