• 【udacity】Machine Learning - Regression


    Evernote Export

    1. What is regression?

    regression
    In supervised learning, the samples include both inputs and outputs; on this basis, we can map a new input to an output to produce a prediction.
    Outputs may be discrete or continuous.

    2. Regression and function approximation

    Regression here does not mean "falling back toward the mean"; it means using a functional form to approximate a set of data points.

    3. Linear regression

    What is a linear equation?

    A linear equation is the equation of a straight line, which can be written as

    Y = mx + b

    where m is the slope and b is the intercept. This is a line equation, not a plane equation.

    What is regression analysis?

    Regression analysis is a statistical concept. The idea is to observe data and build an equation so that we can predict missing or future data.

    What is linear regression?

    Linear regression models the linear relationship between a dependent variable (Y) and independent variables (X1, X2, X3, ...):

    Y = θ0 + θ1·X1 + θ2·X2 + ... + θn·Xn

    ID | Statistics score | Programming score | Data Science score
    ---|------------------|-------------------|--------------------
    A  | 50               | 80                | 65
    B  | 80               | 65                | 83
    C  | 60               | 60                | 69
    D  | 95               | 80                | 92
    E  | 95               | 50                | 84
    F  | 40               | 90                | 55

    Here y is the output variable, X1, X2, ..., Xn are the input variables, and θ0, θ1, ..., θn are called the parameters or weights.
    So in the scores dataset above, y is the Data Science score, X1 is the Statistics score, and X2 is the Programming score:
    y = θ0 + θ1·x1 + θ2·x2

    Why are these θ called weights?
    Each θ tells us how much weight the corresponding X carries in predicting the output: if a particular θ is small compared to the others, the corresponding X plays a smaller role in the prediction.
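    As an illustration (a sketch, not part of the original notes), the scores table above can be fitted by ordinary least squares to estimate θ0, θ1, θ2 with NumPy:

```python
import numpy as np

# Scores from the table: statistics (X1), programming (X2), data science (y)
X = np.array([[50, 80], [80, 65], [60, 60], [95, 80], [95, 50], [40, 90]], dtype=float)
y = np.array([65, 83, 69, 92, 84, 55], dtype=float)

# Prepend a column of ones so theta[0] acts as the intercept theta_0
X_b = np.hstack([np.ones((len(X), 1)), X])

# Least-squares solution for y = theta_0 + theta_1*x1 + theta_2*x2
theta, *_ = np.linalg.lstsq(X_b, y, rcond=None)
print(theta)          # [theta_0, theta_1, theta_2]
print(X_b @ theta)    # fitted data-science scores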

    Why do errors arise?
    Although a linear equation lets us predict, the real world is multi-faceted and cannot be captured exactly by a simple linear equation; this leads to model error.
    So we need to account for the error:
    Sum of absolute errors: Σ_{i=1}^{m} |ŷ_i − y_i|
    Sum of squared errors: (1/2) Σ_{i=1}^{m} (ŷ_i − y_i)²
    We will use this expression later when optimizing the model with gradient descent.
    Why do gradient descent and the squared error work well together?
    Gradient descent uses the derivative of the function being minimized, and the squared error is smooth and differentiable everywhere.
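    A minimal gradient-descent sketch (the learning rate and iteration count are illustrative choices) that fits y = ax + b by following the derivative of the mean squared error:

```python
import numpy as np

# Toy data following y = 2x + 3 plus a little noise
rng = np.random.default_rng(0)
x = rng.random(200)
y = 2.0 * x + 3.0 + 0.1 * rng.standard_normal(200)

a, b = 0.0, 0.0
lr = 0.1  # learning rate (an illustrative choice)
for _ in range(2000):
    err = a * x + b - y            # prediction error
    grad_a = 2 * np.mean(err * x)  # d/da of the mean squared error
    grad_b = 2 * np.mean(err)      # d/db of the mean squared error
    a -= lr * grad_a
    b -= lr * grad_b

print(a, b)  # close to the true values 2 and 3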

    4. Finding the best fit

    For a constant model f(x) = c, the error is
    E(c) = Σ_{i=1}^{n} (y_i − c)²
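    Setting the derivative E′(c) = −2 Σ(y_i − c) to zero gives c = mean(y). A quick numeric check with assumed toy values:

```python
import numpy as np

y = np.array([3.0, 5.0, 4.0, 8.0])

# E(c) = sum((y_i - c)^2) is minimized at the mean of y
c_best = y.mean()
E = lambda c: np.sum((y - c) ** 2)

print(c_best)  # 5.0
print(E(c_best) <= E(4.9) and E(c_best) <= E(5.1))  # True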

    5. Order of the polynomial

    Xw ≈ y
    XᵀXw ≈ Xᵀy
    (XᵀX)⁻¹XᵀXw ≈ (XᵀX)⁻¹Xᵀy
    w = (XᵀX)⁻¹Xᵀy
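    The derivation above can be sketched for a degree-2 polynomial with NumPy (noise-free toy data is assumed, so w is recovered essentially exactly):

```python
import numpy as np

# Toy data from y = 1 + 2x + 3x^2 (no noise)
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = 1 + 2 * x + 3 * x ** 2

# Design matrix for a degree-2 polynomial: columns [1, x, x^2]
X = np.vander(x, N=3, increasing=True)

# Normal-equation solution w = (X^T X)^{-1} X^T y
w = np.linalg.solve(X.T @ X, X.T @ y)
print(w)  # ≈ [1. 2. 3.]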

    6. Error

    Errors arise for many reasons.

    7. Cross-validation

    The purpose of cross-validation is to make our machine learning model generalize, so we must not build the model directly on the test set.
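    A minimal k-fold cross-validation sketch in plain NumPy (the fold count and toy data are assumptions for illustration): each fold is held out once for validation while the model is fit on the remaining folds, so the test set is never touched during model selection.

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.random(90)
y = 2.0 * x + 3.0 + 0.1 * rng.standard_normal(90)

k = 3
indices = rng.permutation(len(x))
folds = np.array_split(indices, k)  # k disjoint index sets

for i, val_idx in enumerate(folds):
    train_idx = np.hstack([f for j, f in enumerate(folds) if j != i])
    # Fit y = a*x + b on the training folds only
    a, b = np.polyfit(x[train_idx], y[train_idx], 1)
    val_mse = np.mean((a * x[val_idx] + b - y[val_idx]) ** 2)
    print(f"fold {i}: validation MSE = {val_mse:.4f}")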

    8. Summary

    1. The history of regression
    2. Model selection: overfitting, underfitting, a good fit, and cross-validation
    3. Linear regression and polynomial regression
    4. The best constant under squared error, and the calculus used to derive it
    5. General approaches
    6. Notation and applications of regression

    The linear regression algorithm

    • Solves regression problems
    • Simple idea, easy to implement
    • The foundation of many powerful nonlinear models
    • Results are highly interpretable
    • Embodies many important ideas in machine learning

    The basic approach of a class of machine learning algorithms

    Goal: find a and b such that Σ_{i=1}^{m} (y⁽ⁱ⁾ − a·x⁽ⁱ⁾ − b)² — the loss function — is as small as possible; when an algorithm instead measures the degree of fit, the quantity to be maximized is called a utility function.
    This is the classic least squares problem: minimize the squared error.

    a = Σ_{i=1}^{m} (x⁽ⁱ⁾ − x̄)(y⁽ⁱ⁾ − ȳ) / Σ_{i=1}^{m} (x⁽ⁱ⁾ − x̄)²,   b = ȳ − a·x̄

    • Analyze the problem to determine its loss function or utility function
    • Optimize the loss function or utility function to obtain the machine learning model
    • All parameter-learning algorithms ultimately solve for the extremum of some objective: the model's final parameters are the ones that optimize the loss or utility function
    • Linear regression
    • Polynomial regression
    • SVM
    • Neural networks
    • Logistic regression
    • .............
      Optimization principle: classical algorithms also follow this optimization approach
      Convex optimization: addresses a special class of optimization problems

    Least squares

    A simple linear regression algorithm

    import numpy as np
    import matplotlib.pyplot as plt

    class SimpleLinearRegression1:
        def __init__(self):
            self.a_ = None
            self.b_ = None

        def fit(self, x_train, y_train):
            '''Train the simple linear regression model on the training set x_train, y_train.'''
            assert x_train.ndim == 1, "simple linear regressor can only solve single feature training data."
            assert len(x_train) == len(y_train), 'the size of x_train must be equal to the size of y_train'
            x_mean = np.mean(x_train)
            y_mean = np.mean(y_train)
            # Least-squares slope: sum((x_i - x_mean)(y_i - y_mean)) / sum((x_i - x_mean)^2)
            num = 0.0
            d = 0.0
            for x_i, y_i in zip(x_train, y_train):
                num += (x_i - x_mean) * (y_i - y_mean)
                d += (x_i - x_mean) ** 2
            self.a_ = num / d
            self.b_ = y_mean - self.a_ * x_mean
            return self

        def predict(self, x_predict):
            '''Given a dataset x_predict, return the vector of predicted values.'''
            assert x_predict.ndim == 1, "simple linear regressor can only solve single feature training data."
            assert self.a_ is not None and self.b_ is not None, 'must fit before predict!'
            return np.array([self._predict(x) for x in x_predict])

        def _predict(self, x_single):
            '''Given a single value x_single, return its predicted value.'''
            return self.a_ * x_single + self.b_

        def __repr__(self):
            return 'SimpleLinearRegression1()'


    # x and y are 1-D sample arrays prepared beforehand
    reg1 = SimpleLinearRegression1()
    reg1.fit(x, y)
    y_hat1 = reg1.predict(x)
    plt.scatter(x, y)
    plt.plot(x, y_hat1, color='red')
    plt.show()
    
    

    Vectorized computation

    Vectorized computation is much faster than the element-by-element loop above.

    class SimpleLinearRegression2:
        def __init__(self):
            self.a_ = None
            self.b_ = None

        def fit(self, x_train, y_train):
            '''Train the simple linear regression model on the training set x_train, y_train.'''
            assert x_train.ndim == 1, "simple linear regressor can only solve single feature training data."
            assert len(x_train) == len(y_train), 'the size of x_train must be equal to the size of y_train'
            x_mean = np.mean(x_train)
            y_mean = np.mean(y_train)
            # Vectorized: both sums become dot products over the centered arrays
            num = (x_train - x_mean).dot(y_train - y_mean)
            d = (x_train - x_mean).dot(x_train - x_mean)
            self.a_ = num / d
            self.b_ = y_mean - self.a_ * x_mean
            return self

        def predict(self, x_predict):
            '''Given a dataset x_predict, return the vector of predicted values.'''
            assert x_predict.ndim == 1, "simple linear regressor can only solve single feature training data."
            assert self.a_ is not None and self.b_ is not None, 'must fit before predict!'
            return np.array([self._predict(x) for x in x_predict])

        def _predict(self, x_single):
            '''Given a single value x_single, return its predicted value.'''
            return self.a_ * x_single + self.b_

        def __repr__(self):
            return 'SimpleLinearRegression2()'

    reg2 = SimpleLinearRegression2()
    reg2.fit(x, y)
    print(reg2.a_, reg2.b_)
    y_hat2 = reg2.predict(x)
    plt.scatter(x, y)
    plt.plot(x, y_hat2, color='red')
    plt.show()
    
    


    Performance test: the vectorized implementation is roughly 50× faster than the loop-based one

    m = 1000000
    big_x = np.random.random(size=m)
    big_y = big_x * 2.0 + 3.0 + np.random.normal(size=m)
    %timeit reg1.fit(big_x, big_y)
    %timeit reg2.fit(big_x, big_y)
    
    >>>
    962 ms ± 60.5 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
    25.9 ms ± 286 µs per loop (mean ± std. dev. of 7 runs, 10 loops each)
    >>>
    

    Evaluating regression algorithms

    (Figure: the dataset is split into training data (80%) and test data (20%).)

    Evaluating the linear regression algorithm

    (1/m) Σ_{i=1}^{m} (y_test⁽ⁱ⁾ − ŷ_test⁽ⁱ⁾)²

    Mean squared error, MSE (Mean Squared Error)

    √[ (1/m) Σ_{i=1}^{m} (y_test⁽ⁱ⁾ − ŷ_test⁽ⁱ⁾)² ] = √(MSE_test)

    Root mean squared error, RMSE (Root Mean Squared Error)

    (1/m) Σ_{i=1}^{m} |y_test⁽ⁱ⁾ − ŷ_test⁽ⁱ⁾|

    Mean absolute error, MAE (Mean Absolute Error)

    R² = 1 − SS_residual / SS_total

    R² = 1 − Σ(ŷ⁽ⁱ⁾ − y⁽ⁱ⁾)² / Σ(ȳ − y⁽ⁱ⁾)²

    R² measures how much of the baseline error our model eliminates (the baseline being a model that always predicts the mean):

    • R² <= 1
    • The larger R², the better. When the prediction model makes no errors at all, R² reaches its maximum value of 1
    • When our model is no better than the baseline model, R² is 0
    • If R² < 0, the learned model is worse than the baseline model; in that case, the data quite possibly has no linear relationship at all

    The score method of sklearn's linear regression uses this R² metric.
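    The four metrics above can be computed directly with NumPy (toy predictions assumed):

```python
import numpy as np

y_test = np.array([3.0, 5.0, 2.5, 7.0])
y_hat  = np.array([2.8, 5.3, 2.4, 6.5])

mse  = np.mean((y_test - y_hat) ** 2)
rmse = np.sqrt(mse)
mae  = np.mean(np.abs(y_test - y_hat))
# R^2 = 1 - SS_residual / SS_total, with the mean predictor as baseline
r2   = 1 - np.sum((y_hat - y_test) ** 2) / np.sum((y_test.mean() - y_test) ** 2)

print(mse, rmse, mae, r2)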

    Multivariate linear regression

    There are as many θ coefficients as the data has features (dimensions); θ0 is the intercept.
    Goal: find θ0, θ1, θ2, ..., θn such that Σ_{i=1}^{m} (y⁽ⁱ⁾ − ŷ⁽ⁱ⁾)² is as small as possible,
    i.e. such that (y − X_b·θ)ᵀ(y − X_b·θ) is as small as possible.
    The closed-form solution of multivariate linear regression (Normal Equation):
    θ = (X_bᵀ X_b)⁻¹ X_bᵀ y
    Problem: high time complexity, O(n³) (optimized implementations reach about O(n^2.4))
    Advantage: no need to normalize the data
    Solution:
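    A hedged sketch of the normal-equation solution on random toy data (`np.linalg.solve` is used instead of forming the inverse explicitly, for numerical stability):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((100, 3))                       # 100 samples, 3 features
true_theta = np.array([4.0, 2.0, -1.0, 3.0])  # [intercept, coefficients...]
y = np.hstack([np.ones((100, 1)), X]) @ true_theta + 0.01 * rng.standard_normal(100)

X_b = np.hstack([np.ones((len(X), 1)), X])    # add the intercept column

# theta = (X_b^T X_b)^{-1} X_b^T y, computed without forming the inverse
theta = np.linalg.solve(X_b.T @ X_b, X_b.T @ y)
print(theta)  # ≈ [4, 2, -1, 3]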

  • Original post: https://www.cnblogs.com/pandaboy1123/p/10245549.html