Applied ML 2020 - 09 - Model Evaluation and Metrics


Comments • 3

  • @slapfighting • 3 years ago +7

    This course is the gold standard of Machine learning

  • @ryanglambert • 4 years ago +2

    Recall at Precision XX%

  • @Han-ve8uh • 3 years ago

    1. At 31:40 a threshold of 0.3 is chosen for predict_proba. How do we know what threshold value to use when thresholding decision_function? Its output is no longer in the 0-1 range but goes from negative to positive infinity. But I see at 56:20 that average_precision_score and roc_auc_score can work on svc.decision_function. Are these two metrics selecting threshold values based on the predicted scores (like how they work on predict_proba output)?
    2. At 1:02:48 it says the top one needs probabilities. Does this mean decision_function will not work? It seems like any measure that ranks the predictions allows generating predicted classes from varying thresholds, so decision_function should work too?
    3. At 1:02:58 it says the bottom one works for any kind of uncertainties. What do the "uncertainties" refer to?
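A small sketch addressing question 1 above: ranking metrics such as roc_auc_score and average_precision_score never pick a single threshold. They sweep over every threshold implied by the scores, so any monotonic score works: predict_proba values in 0-1 and unbounded decision_function margins give the same result whenever one is a monotonic transform of the other. The example uses LogisticRegression (an assumption for illustration, not the lecture's SVC) precisely because its predict_proba is the sigmoid of its decision_function, so the ranking is identical by construction:

```python
# Sketch (assumed setup, not from the lecture): ranking metrics accept
# either probabilities or unbounded decision scores, because they only
# use the ordering of the predictions, not their absolute scale.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, average_precision_score

X, y = make_classification(n_samples=300, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

scores = clf.decision_function(X_te)   # unbounded: -inf .. +inf
proba = clf.predict_proba(X_te)[:, 1]  # sigmoid(scores): 0 .. 1

# Same ordering of test points -> same ROC AUC and average precision.
auc_from_scores = roc_auc_score(y_te, scores)
auc_from_proba = roc_auc_score(y_te, proba)
ap_from_scores = average_precision_score(y_te, scores)
ap_from_proba = average_precision_score(y_te, proba)
```

Note that for an SVC the two can disagree slightly: its predict_proba is fit by an internal cross-validated Platt scaling, which scikit-learn documents as potentially inconsistent with decision_function.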