• [coursera machine learning] Week 1


    1. Categories of machine learning problems:

    Supervised Learning: right answers given in samples

               Regression: continuous result

               Classification: discrete valued output

    Unsupervised Learning: learning about a dataset without correct answers

                Clustering: divide a dataset into groups

                Non-clustering: separate individual voices out of a mixed audio recording (the cocktail party problem)

    2. Model Representation:

    training set -> learning algorithm -> hypothesis

    x -> hypothesis -> y
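    For Week 1's univariate linear regression, the hypothesis is a linear function of the single input feature; in the course's notation:

    h_\theta(x) = \theta_0 + \theta_1 x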

    3. Cost Function:

    m is the number of training samples
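    The squared-error cost function for linear regression over the m training samples is:

    J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2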

    4. Gradient Descent (not only for linear regression)

    n is the number of features

    minimizes a function (e.g., the cost function)

    alpha is the learning rate

    all theta values should be updated simultaneously (see the sketch below).
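    A minimal Python (NumPy) sketch of batch gradient descent for linear regression, showing the simultaneous update of all theta values; the function name and default parameters are illustrative, not from the course:

import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression.

    X: (m, n) feature matrix with a leading column of ones for theta_0.
    y: (m,) target vector.
    alpha: learning rate.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        # Gradient of the cost J(theta): (1/m) * X^T (X theta - y)
        gradient = X.T @ (X @ theta - y) / m
        # Simultaneous update: compute the full gradient first,
        # then update every theta_j in one step.
        theta = theta - alpha * gradient
    return theta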

    5. Normal Equation Formula
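    The normal equation gives the closed-form solution for linear regression (X is the design matrix with a leading column of ones, y the target vector):

    \theta = (X^T X)^{-1} X^T y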

    Comparison of gradient descent and the normal equation:

    the normal equation is faster when there are few features.

    gradient descent is faster when there are many features.
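    A minimal NumPy sketch of the normal equation; the function name is my own, and np.linalg.pinv (pseudo-inverse) is used so the code still works when X^T X is not invertible:

import numpy as np

def normal_equation(X, y):
    """Closed-form linear regression: theta = pinv(X^T X) X^T y.

    X: (m, n) feature matrix with a leading column of ones.
    y: (m,) target vector.
    """
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Example usage on a tiny dataset where y = 1 + 2*x exactly:
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
print(normal_equation(X, y))  # approximately [1.0, 2.0]

    No learning rate and no iteration are needed, but computing the (pseudo-)inverse costs roughly O(n^3), which is why gradient descent wins when n is large.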
