• Multiclass Classifier


     

    Multiclass Classification: One-vs-all

    Now we will approach the classification of data when we have more than two categories. Instead of y = {0, 1} we will expand our definition so that y = {0, 1, ..., n}.

    Since y = {0, 1, ..., n}, we divide our problem into n+1 binary classification problems (+1 because the index starts at 0); in each one, we predict the probability that 'y' is a member of one of our classes.

    $$
    \begin{aligned}
    y &\in \{0, 1, \dots, n\} \\
    h_\theta^{(0)}(x) &= P(y = 0 \mid x; \theta) \\
    h_\theta^{(1)}(x) &= P(y = 1 \mid x; \theta) \\
    &\;\;\vdots \\
    h_\theta^{(n)}(x) &= P(y = n \mid x; \theta) \\
    \mathrm{prediction} &= \max_i \bigl( h_\theta^{(i)}(x) \bigr)
    \end{aligned}
    $$

    We are basically choosing one class and then lumping all the others into a single second class. We do this repeatedly, applying binary logistic regression to each case, and then use the hypothesis that returned the highest value as our prediction.
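    As a rough illustration (not part of the original notes), here is a minimal NumPy sketch of that training loop: for each class i, the labels are relabeled as "class i vs. everything else" and a binary logistic-regression classifier is fit by plain gradient descent. The function names, learning rate, and iteration count are illustrative assumptions, and X is assumed to already contain an intercept column of ones.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_vs_all(X, y, num_classes, lr=0.1, iters=1000):
    """Fit one binary logistic-regression classifier per class.

    Returns a (num_classes, n_features) matrix Theta; row i holds the
    parameters of the "class i vs. rest" classifier.
    """
    m, n = X.shape
    Theta = np.zeros((num_classes, n))
    for i in range(num_classes):
        y_i = (y == i).astype(float)       # 1 for class i, 0 for every other class
        theta = np.zeros(n)
        for _ in range(iters):
            h = sigmoid(X @ theta)         # h_theta^(i)(x) = P(y = i | x; theta)
            grad = X.T @ (h - y_i) / m     # gradient of the logistic cost
            theta -= lr * grad
        Theta[i] = theta
    return Theta
```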

    The following image shows how one could classify 3 classes:

    To summarize:

    Train a logistic regression classifier hθ(i)(x) for each class i to predict the probability that y = i.

    To make a prediction on a new x, pick the class i that maximizes hθ(i)(x).
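    Continuing the hypothetical sketch above, prediction evaluates every binary hypothesis on the new example and picks the class whose probability is largest:

```python
def predict_one_vs_all(Theta, X):
    """Return the predicted class label for each row of X."""
    probs = sigmoid(X @ Theta.T)           # column i holds h_theta^(i)(x) for every example
    return np.argmax(probs, axis=1)        # pick the class whose hypothesis returns the highest value
```

    For example, `predict_one_vs_all(train_one_vs_all(X, y, 3), X_new)` would classify new examples among three classes.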

  • Original post: https://www.cnblogs.com/ne-zha/p/7383177.html