• Local weighted regression


    In locally weighted regression, for every prediction we need to find the theta that minimizes a weighted least-squares cost over the training set.
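    The original formula did not survive extraction; the standard objective from Ng's lecture notes, which this passage appears to describe, is the weighted least-squares cost with a Gaussian kernel weight:

    ```latex
    J(\theta) = \sum_{i=1}^{m} w^{(i)} \left( y^{(i)} - \theta^{T} x^{(i)} \right)^{2},
    \qquad
    w^{(i)} = \exp\!\left( -\frac{\left( x^{(i)} - x \right)^{2}}{2\tau^{2}} \right)
    ```

    Here x is the query input, and the bandwidth tau controls how quickly the weights fall off with distance.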

    Locally weighted regression is very costly: the algorithm fits a new theta vector for every prediction, so the whole training set must be kept around and re-weighted each time a query arrives.

    Besides, locally weighted regression is a non-parametric learning algorithm, which means the number of parameters is not fixed — the amount of information the model must retain grows with the size of the training set.

    The weights are assigned as follows: training samples close to the query input get high weights, while samples far from the query get low weights. Approximately, we can ignore the effect of the low-weighted samples and consider only the highly weighted ones, which turns the original problem into a local linear regression around the query point. Intuitively, this should lead to a good prediction.
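    A minimal NumPy sketch of this procedure, assuming the standard Gaussian kernel weights and the closed-form weighted normal equations (the bandwidth `tau` and the demo data are illustrative choices, not from the original post):

    ```python
    import numpy as np

    def lwr_predict(X, y, x_query, tau=0.5):
        """Locally weighted regression prediction at a single query point.

        X: (m, n) design matrix (includes a column of ones for the intercept).
        y: (m,) targets.
        x_query: (n,) query point in the same representation as rows of X.
        tau: Gaussian kernel bandwidth (assumed value; tune per problem).
        """
        # Gaussian kernel weights: points near x_query get weight ~1, far points ~0
        diff = X - x_query
        w = np.exp(-np.sum(diff ** 2, axis=1) / (2 * tau ** 2))
        W = np.diag(w)
        # Solve the weighted normal equations: (X^T W X) theta = X^T W y
        theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return x_query @ theta

    # Tiny demo on exactly linear data y = 2x, so the local fit recovers it
    X = np.column_stack([np.ones(5), np.arange(5.0)])
    y = 2.0 * np.arange(5.0)
    pred = lwr_predict(X, y, np.array([1.0, 2.0]), tau=1.0)  # close to 4.0
    ```

    Note that `theta` is solved from scratch inside `lwr_predict` — this is exactly the per-query training cost the text describes.
    
    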

    As Andrew Ng mentioned, a KD-tree may help improve the efficiency of this algorithm. That is indeed the case: KD-trees are very good at finding nearest neighbors.
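    The idea can be sketched with a brute-force neighbor search standing in for the tree; at scale, a real KD-tree (e.g. `scipy.spatial.cKDTree`) would replace the `argpartition` step with a much faster lookup. The helper name and demo data below are illustrative:

    ```python
    import numpy as np

    def k_nearest(X, x_query, k=5):
        """Indices of the k training points closest to x_query (brute force)."""
        d2 = np.sum((X - x_query) ** 2, axis=1)  # squared distances to query
        return np.argpartition(d2, k)[:k]        # indices of the k closest points

    # 100 points on [0, 10]; restrict the local fit to the neighborhood of 5.0
    X = np.linspace(0, 10, 100).reshape(-1, 1)
    idx = k_nearest(X, np.array([5.0]), k=5)
    # The weighted regression would then be fit on X[idx] rather than all of X,
    # since far-away points carry negligible weight anyway.
    ```
    
    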

    It is generally a bad idea to use regression algorithms to solve classification problems.

  • Original article: https://www.cnblogs.com/flytomylife/p/3087485.html