13.4.1 Recursive Feature Elimination (L13: Feature Selection)

  • Published Sep 6, 2024
  • Sebastian's books: sebastianrasch...
    In this video, we start our discussion of wrapper methods for feature selection. In particular, we cover Recursive Feature Elimination (RFE) and see how we can use it in scikit-learn to select features based on linear model coefficients.
    Slides: sebastianrasch...
    Code: github.com/ras...
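    The approach described above can be sketched with scikit-learn's RFE class, which recursively fits an estimator and prunes the feature with the smallest absolute coefficient until the desired number of features remains. The dataset choice (the built-in Wisconsin breast cancer data) and the hyperparameters here are illustrative assumptions, not necessarily those used in the video:

    ```python
    # Sketch of Recursive Feature Elimination (RFE) with logistic regression
    # coefficients as the ranking criterion. Dataset and hyperparameters are
    # chosen for illustration only.
    from sklearn.datasets import load_breast_cancer
    from sklearn.feature_selection import RFE
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)

    # Standardize so coefficient magnitudes are comparable across features.
    X_std = StandardScaler().fit_transform(X)

    # RFE repeatedly refits the estimator, dropping step=1 feature per
    # iteration (the one with the smallest absolute coefficient) until
    # n_features_to_select features remain.
    rfe = RFE(estimator=LogisticRegression(max_iter=1000),
              n_features_to_select=5, step=1)
    rfe.fit(X_std, y)

    print("Selected feature mask:", rfe.support_)
    print("Feature ranking (1 = selected):", rfe.ranking_)
    ```

    Note that standardizing the features matters here: without it, coefficient magnitudes reflect feature scales rather than importance, and the elimination order becomes arbitrary.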

    Logistic regression lectures:
    L8.0 Logistic Regression - Lecture Overview (06:28)
    L8.1 Logistic Regression as a Single-Layer Neural Network (09:15)
    L8.2 Logistic Regression Loss Function (12:57)
    L8.3 Logistic Regression Loss Derivative and Training (19:57)
    L8.4 Logits and Cross Entropy (06:47)
    L8.5 Logistic Regression in PyTorch - Code Example (19:02)
    L8.6 Multinomial Logistic Regression / Softmax Regression (17:31)
    L8.7.1 OneHot Encoding and Multi-category Cross Entropy (15:34)
    L8.7.2 OneHot Encoding and Multi-category Cross Entropy Code Example (15:04)
    L8.8 Softmax Regression Derivatives for Gradient Descent (19:38)
    L8.9 Softmax Regression Code Example Using PyTorch (25:39)
    -------
    This video is part of my Introduction to Machine Learning course.
    Next video: • 13.4.2 Feature Permuta...
    The complete playlist: • Intro to Machine Learn...
    A handy overview page with links to the materials: sebastianrasch...
    -------
    If you want to be notified about future videos, please consider subscribing to my channel: / sebastianraschka
