• Matrix review


    Transpose of a matrix:     $A^T = (a_{ji})$,    where $A = (a_{ij})$

    Conjugate transpose of a matrix:     $A^{*T} = (\bar{a}_{ji})$,    where $A = (a_{ij})$
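
    (A quick numerical sketch, not part of the original note: the two operations correspond to `.T` and `.conj().T` in NumPy; the matrix `A` below is an arbitrary example.)

        import numpy as np

        # A small complex matrix A = (a_ij), used purely as an example.
        A = np.array([[1 + 2j, 3 - 1j],
                      [0 + 1j, 4 + 0j]])

        A_T = A.T             # transpose:           (A^T)_ij = a_ji
        A_H = A.conj().T      # conjugate transpose: (A^{*T})_ij = conj(a_ji)

        print(A_T)
        print(A_H)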

    //----------------------------------------------------------------------------------------------------

    The following is reproduced from: http://fourier.eng.hmc.edu/e161/lectures/klt/node3.html

    Explanations and notes have been added below on top of the original text.

    Karhunen-Loeve Transform (KLT)

    Now we consider the Karhunen-Loeve Transform (KLT) (also known as the Hotelling Transform and the Eigenvector Transform), which is closely related to Principal Component Analysis (PCA) and is widely used for data analysis in many fields.

    Let ${\bf\phi}_k$ be the eigenvector corresponding to the $k$th eigenvalue $\lambda_k$ of the covariance matrix ${\bf\Sigma}_x$, i.e.,

    \begin{displaymath}{\bf\Sigma}_x {\bf\phi}_k=\lambda_k{\bf\phi}_k\;\;\;\;\;\;(k=1,\cdots,N) \end{displaymath}


    or in matrix form:

    \begin{displaymath}
    \left[ \begin{array}{ccc}
    \sigma_{11} & \cdots & \sigma_{1N} \\
    \vdots & \ddots & \vdots \\
    \sigma_{N1} & \cdots & \sigma_{NN}
    \end{array} \right]
    {\bf\phi}_k
    =\lambda_k\,{\bf\phi}_k
    \;\;\;\;\;\;(k=1,\cdots,N)
    \end{displaymath}
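
    (A minimal NumPy sketch of this eigenequation, added here as an illustration rather than taken from the source; `Sigma_x` is the sample covariance of some randomly generated data, and `np.linalg.eigh` returns its eigenvalues and eigenvectors.)

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 4))    # 500 samples of an example 4-dimensional signal
        Sigma_x = np.cov(X, rowvar=False)    # 4 x 4 covariance matrix (symmetric)

        # eigh handles Hermitian/symmetric matrices; column k of Phi is the eigenvector phi_k
        lam, Phi = np.linalg.eigh(Sigma_x)

        # verify Sigma_x phi_k = lambda_k phi_k for every k
        for k in range(len(lam)):
            assert np.allclose(Sigma_x @ Phi[:, k], lam[k] * Phi[:, k])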


    As the covariance matrix ${\bf\Sigma}_x={\bf\Sigma}_x^{*T}$ is Hermitian (symmetric if ${\bf x}$ is real), its eigenvectors ${\bf\phi}_i$ are orthogonal:

    (A Hermitian matrix is a generalization of a symmetric matrix.)

    \begin{displaymath}
    \langle {\bf\phi}_i,{\bf\phi}_j \rangle={\bf\phi}^{*T}_i {\bf\phi}_j
    =\left\{ \begin{array}{ll} 1 & i=j \\ 0 & i\ne j \end{array} \right.
    \end{displaymath}


    and we can construct an $N \times N$ unitary (orthogonal if ${\bf x}$ is real) matrix ${\bf\Phi}$

    \begin{displaymath}{\bf\Phi}\stackrel{\triangle}{=}[{\bf\phi}_1, \cdots,{\bf\phi}_{N}] \end{displaymath}


    satisfying

    (A unitary matrix is a generalization of an orthogonal matrix.)

    \begin{displaymath}{\bf\Phi}^{*T} {\bf\Phi} = {\bf I},\;\;\;\;\mbox{i.e.,}\;\;\;\;
    {\bf\Phi}^{-1}={\bf\Phi}^{*T} \end{displaymath}
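
    (Continuing the same illustrative setup: since the example data are real, ${\bf\Phi}$ is orthogonal, and the orthonormality and invertibility conditions can be checked directly.)

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 4))
        Sigma_x = np.cov(X, rowvar=False)
        lam, Phi = np.linalg.eigh(Sigma_x)

        # <phi_i, phi_j> = 1 if i == j else 0, i.e. Phi^{*T} Phi = I
        assert np.allclose(Phi.conj().T @ Phi, np.eye(4))

        # hence Phi^{-1} = Phi^{*T}
        assert np.allclose(np.linalg.inv(Phi), Phi.conj().T)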


    The $N$ eigenequations above can be combined to be expressed as:

    \begin{displaymath}{\bf\Sigma}_x{\bf\Phi}={\bf\Phi}{\bf\Lambda} \end{displaymath}


    or in matrix form:

    \begin{displaymath}
    \left[ \begin{array}{ccc}
    \sigma_{11} & \cdots & \sigma_{1N} \\
    \vdots & \ddots & \vdots \\
    \sigma_{N1} & \cdots & \sigma_{NN}
    \end{array} \right]
    \left[ \begin{array}{ccc}
    {\bf\phi}_1 & \cdots & {\bf\phi}_{N}
    \end{array} \right]
    =
    \left[ \begin{array}{ccc}
    {\bf\phi}_1 & \cdots & {\bf\phi}_{N}
    \end{array} \right]
    \left[ \begin{array}{ccc}
    \lambda_1 & \cdots & 0 \\
    \vdots & \ddots & \vdots \\
    0 & \cdots & \lambda_{N}
    \end{array} \right]
    \end{displaymath}


    Here ${\bf\Lambda}$ is the diagonal matrix ${\bf\Lambda}=\mbox{diag}(\lambda_1, \cdots, \lambda_{N})$. Left multiplying ${\bf\Phi}^{*T}={\bf\Phi}^{-1}$ on both sides, the covariance matrix ${\bf\Sigma}_x$ can be diagonalized:

    \begin{displaymath}{\bf\Phi}^{*T}{\bf\Sigma}_x{\bf\Phi}={\bf\Phi}^{-1} {\bf\Sigma}_x {\bf\Phi}
    = {\bf\Phi}^{-1}{\bf\Phi}{\bf\Lambda}={\bf\Lambda} \end{displaymath}
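
    (In the same sketch, the diagonalization can be verified numerically: conjugating ${\bf\Sigma}_x$ by ${\bf\Phi}$ yields the diagonal matrix of eigenvalues.)

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 4))
        Sigma_x = np.cov(X, rowvar=False)
        lam, Phi = np.linalg.eigh(Sigma_x)

        # Phi^{*T} Sigma_x Phi should equal Lambda = diag(lambda_1, ..., lambda_N)
        Lambda = Phi.conj().T @ Sigma_x @ Phi
        assert np.allclose(Lambda, np.diag(lam))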


    Now, given a signal vector ${\bf x}$, we can define a unitary (orthogonal if ${\bf x}$ is real) Karhunen-Loeve Transform of ${\bf x}$ as:

    \begin{displaymath}
    {\bf y}=\left[ \begin{array}{c} y_1 \\ \vdots \\ y_{N} \end{array} \right]
    \stackrel{\triangle}{=}{\bf\Phi}^{*T}{\bf x}
    =\left[ \begin{array}{c} {\bf\phi}_1^{*T} \\ \vdots \\ {\bf\phi}_{N}^{*T} \end{array} \right]
    \left[\begin{array}{c} x_1 \\ \vdots \\ x_N \end{array}\right]
    \end{displaymath}


    where the $i$th component $y_i$ of the transform vector is the projection of ${\bf x}$ onto ${\bf\phi}_i$:

    \begin{displaymath}
    y_i=\langle {\bf\phi}_i,{\bf x} \rangle={\bf\phi}_i^{*T}{\bf x}
    \end{displaymath}
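
    (As an illustrative sketch of the forward transform, again using the same randomly generated example: each component $y_i$ is the inner product of ${\bf\phi}_i$ with the signal vector `x`.)

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 4))
        Sigma_x = np.cov(X, rowvar=False)
        lam, Phi = np.linalg.eigh(Sigma_x)

        x = rng.standard_normal(4)      # an example signal vector

        y = Phi.conj().T @ x            # forward KLT: y = Phi^{*T} x

        # component-wise: y_i is the projection of x onto phi_i
        for i in range(4):
            assert np.allclose(y[i], Phi[:, i].conj() @ x)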


    Left multiplying ${\bf\Phi}=({\bf\Phi}^{*T})^{-1}$ on both sides of the transform ${\bf y}={\bf\Phi}^{*T} {\bf x}$, we get the inverse transform:

    \begin{displaymath}
    {\bf x}={\bf\Phi} {\bf y}
    =\left[\begin{array}{ccc} {\bf\phi}_1 & \cdots & {\bf\phi}_{N} \end{array} \right]
    \left[\begin{array}{c} y_1 \\ \vdots \\ y_{N} \end{array} \right]
    =\sum_{i=1}^{N} y_i {\bf\phi}_i
    \end{displaymath}


    We see that by this transform, the signal vector ${\bf x}$ is now expressed in the $N$-dimensional space spanned by the $N$ eigenvectors ${\bf\phi}_i$ ($i=1,\cdots,N$), which serve as the basis vectors of the space.
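
    (A final sketch tying the forward and inverse transforms together: under the same example setup, the signal is recovered exactly, up to floating-point error, as a weighted sum of the eigenvectors.)

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 4))
        Sigma_x = np.cov(X, rowvar=False)
        lam, Phi = np.linalg.eigh(Sigma_x)

        x = rng.standard_normal(4)
        y = Phi.conj().T @ x                 # forward transform

        x_rec = Phi @ y                      # inverse transform: x = Phi y
        assert np.allclose(x_rec, x)

        # equivalently, x = sum_i y_i * phi_i
        x_sum = sum(y[i] * Phi[:, i] for i in range(4))
        assert np.allclose(x_sum, x)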

     