• Performance Metrics for Binary Classification Problems Cheatsheet


    This article is merely a quick recap of machine learning knowledge; it is not meant to serve as a tutorial.
    All rights reserved by Diane (Qingyun Hu).

    Prerequisites

    TP: True Positive
    FP: False Positive
    TN: True Negative
    FN: False Negative
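
    As a quick illustration (not part of the original cheatsheet), the four counts can be tallied directly from a pair of label vectors. The y_true and y_pred values below are made up for this sketch, and the resulting toy counts (TP = 3, FP = 1, TN = 5, FN = 1) are reused in the sketches that follow.

    # Tally TP, FP, TN, FN from ground-truth and predicted labels (1 = positive, 0 = negative).
    y_true = [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
    y_pred = [1, 0, 1, 0, 1, 0, 0, 1, 0, 0]

    TP = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # predicted 1, actually 1
    FP = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # predicted 1, actually 0
    TN = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # predicted 0, actually 0
    FN = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # predicted 0, actually 1

    print(TP, FP, TN, FN)  # 3 1 5 1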

    Recall

    = Sensitivity = TPR (True Positive Rate)
    \begin{equation}
    \text{Recall} = \frac{TP}{TP + FN}
    \end{equation}
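
    A minimal sketch of recall as a Python function, evaluated on the toy counts from the Prerequisites sketch (TP = 3, FN = 1):

    def recall(tp: int, fn: int) -> float:
        """Recall = TP / (TP + FN): the share of actual positives that are identified."""
        return tp / (tp + fn)

    print(recall(3, 1))  # 0.75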

    Precision

    \begin{equation}
    \text{Precision} = \frac{TP}{TP + FP}
    \end{equation}
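
    A matching sketch for precision, using the same toy counts (TP = 3, FP = 1):

    def precision(tp: int, fp: int) -> float:
        """Precision = TP / (TP + FP): the share of positive predictions that are correct."""
        return tp / (tp + fp)

    print(precision(3, 1))  # 0.75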

    Accuracy

    \begin{equation}
    \text{Accuracy} = \frac{TP + TN}{TP + FP + TN + FN}
    \end{equation}
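
    A sketch for accuracy on the same toy counts (TP = 3, FP = 1, TN = 5, FN = 1):

    def accuracy(tp: int, fp: int, tn: int, fn: int) -> float:
        """Accuracy = (TP + TN) / (TP + FP + TN + FN): the share of all predictions that are correct."""
        return (tp + tn) / (tp + fp + tn + fn)

    print(accuracy(3, 1, 5, 1))  # 0.8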

    F1 Score

    \begin{equation}
    \text{F1 Score} = \frac{2 \cdot \text{Recall} \cdot \text{Precision}}{\text{Recall} + \text{Precision}}
    \end{equation}
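
    A sketch combining the two formulas above into an F1 score, again on the toy counts (TP = 3, FP = 1, FN = 1):

    def f1_score(tp: int, fp: int, fn: int) -> float:
        """F1 = 2 * precision * recall / (precision + recall): the harmonic mean of precision and recall."""
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    print(f1_score(3, 1, 1))  # 0.75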

    Specificity

    \begin{equation}
    \text{Specificity} = \frac{TN}{TN + FP}
    \end{equation}
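
    A sketch for specificity on the toy counts (TN = 5, FP = 1):

    def specificity(tn: int, fp: int) -> float:
        """Specificity = TN / (TN + FP): the share of actual negatives that are correctly rejected."""
        return tn / (tn + fp)

    print(specificity(5, 1))  # ~0.833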

    FPR (False Positive Rate)

    = 1 - Specificity
    \begin{equation}
    \text{FPR} = \frac{FP}{TN + FP}
    \end{equation}
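
    A sketch for FPR on the toy counts (FP = 1, TN = 5), which works out to 1 - specificity as stated above:

    def false_positive_rate(fp: int, tn: int) -> float:
        """FPR = FP / (TN + FP): the share of actual negatives that are wrongly flagged as positive."""
        return fp / (tn + fp)

    print(false_positive_rate(1, 5))  # ~0.167 = 1 - 0.833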

    ROC Curve

    x-axis: FPR ( = 1 - Specificity )
    y-axis: TPR ( = Recall )
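
    A minimal sketch of tracing a ROC curve, assuming scikit-learn is available; y_true and y_score (predicted probabilities) are made-up illustrative values, not from the original article:

    from sklearn.metrics import roc_curve

    y_true = [0, 0, 1, 1, 0, 1, 0, 1]
    y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.9]

    # Each score threshold yields one (FPR, TPR) point on the curve.
    fpr, tpr, thresholds = roc_curve(y_true, y_score)
    for f, t, th in zip(fpr, tpr, thresholds):
        print(f"threshold={th:.2f}  FPR={f:.2f}  TPR={t:.2f}")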

    AUC (Area under the ROC Curve)

    The larger the area under the ROC curve, the better the classifier separates positives from negatives.
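
    A sketch of computing AUC with scikit-learn on the same made-up scores as above:

    from sklearn.metrics import roc_auc_score

    y_true = [0, 0, 1, 1, 0, 1, 0, 1]
    y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.9]

    print(roc_auc_score(y_true, y_score))  # 0.875; values near 1.0 indicate better ranking, 0.5 is random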

  • Original post: https://www.cnblogs.com/DianeSoHungry/p/11288143.html