After regularization, the cost function for linear regression becomes
\[J(\theta) = \frac{1}{2m}\left[ \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{n} \theta_j^2 \right]\]
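As a concrete illustration, here is a minimal NumPy sketch of this cost function. The names `compute_cost`, `X`, `y`, `theta`, and `lam` are illustrative, and `X` is assumed to already contain a leading column of ones for the intercept term \(x_0\):

```python
import numpy as np

def compute_cost(theta, X, y, lam):
    """Regularized linear regression cost J(theta).

    X     : (m, n+1) design matrix whose first column is all ones
    y     : (m,) target vector
    theta : (n+1,) parameters; theta[0] is the intercept
    lam   : regularization strength lambda
    """
    m = len(y)
    residuals = X @ theta - y                # h_theta(x^(i)) - y^(i)
    penalty = lam * np.sum(theta[1:] ** 2)   # theta_0 is not penalized
    return (residuals @ residuals + penalty) / (2 * m)
```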
The gradient descent algorithm then becomes
Repeat {
\[\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}\]
\[\theta_j := \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \right] \quad (j = 1, 2, \ldots, n)\]
}
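Grouping the \(\theta_j\) terms, the second update can be rewritten as
\[\theta_j := \theta_j \left( 1 - \alpha \frac{\lambda}{m} \right) - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)},\]
so each iteration first shrinks \(\theta_j\) by a factor slightly smaller than 1 and then applies the usual unregularized step. A vectorized sketch of this loop, under the same illustrative assumptions as above (note that `theta[0]` receives no regularization term):

```python
import numpy as np

def gradient_descent(theta, X, y, alpha, lam, num_iters):
    """Batch gradient descent with the regularized updates above."""
    m = len(y)
    for _ in range(num_iters):
        # (1/m) * sum_i (h_theta(x^(i)) - y^(i)) * x_j^(i), for every j at once
        grad = X.T @ (X @ theta - y) / m
        grad[1:] += (lam / m) * theta[1:]   # add (lambda/m) * theta_j for j >= 1 only
        theta = theta - alpha * grad        # simultaneous update of all parameters
    return theta
```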
The normal equation then becomes
\[\theta = \left( X^T X + \lambda \begin{bmatrix} 0 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix} \right)^{-1} X^T y\]
It can be shown that, as long as \(\lambda > 0\), the matrix inside the parentheses is invertible, even when \(m \le n\), a case in which \(X^T X\) itself would be singular.
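A corresponding sketch of the normal equation, again with illustrative names; instead of forming the explicit inverse as in the formula, `np.linalg.solve` is used to solve the linear system directly, which is the numerically preferred equivalent:

```python
import numpy as np

def normal_equation(X, y, lam):
    """Closed-form solution of regularized linear regression."""
    L = np.eye(X.shape[1])   # (n+1) x (n+1) identity matrix ...
    L[0, 0] = 0              # ... with a 0 in the corner so theta_0 is not regularized
    # Solve (X^T X + lambda * L) theta = X^T y
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)
```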