7.1 Feature Expansions | Machine Learning

  • Added 7 Oct 2022
  • FEATURE EXPANSIONS AND DOT PRODUCTS
    What expansion should I use?
    This is not obvious. The illustrations relied on knowledge about the data that
    we likely won’t have (especially in high dimensions).
    One approach is the “kitchen sink”: if you can think of a feature, use it.
    Select the useful features with an ℓ1 penalty:
        ŵ_ℓ1 = arg min_w Σ_{i=1}^n f(y_i, φ(x_i), w) + λ‖w‖_1.
    We know that this will find a sparse subset of the dimensions of φ(x) to use.
    Often we only need to work with the dot products φ(x_i)ᵀφ(x_j) ≡ K(x_i, x_j).
    This is called a kernel, and it can produce some interesting results. (Minimal
    Python sketches of the ℓ1 selection and of the kernel identity follow after the
    description.)
  • Science & Technology
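
As a concrete illustration of the ℓ1 selection step, here is a minimal sketch. It assumes a degree-3 polynomial “kitchen sink” expansion φ(x), a squared-error loss f, and synthetic placeholder data X, y; the name lam stands in for λ. None of these choices come from the video.

    # Minimal sketch (assumed setup): l1-penalized selection over an expanded
    # feature map phi(x), with a polynomial expansion and squared-error loss.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                # raw inputs x_i
    y = X[:, 0] ** 2 - 3 * X[:, 1] + rng.normal(scale=0.1, size=200)

    expand = PolynomialFeatures(degree=3, include_bias=False)
    Phi = expand.fit_transform(X)                # "kitchen sink" features phi(x_i)

    lam = 0.05                                   # lambda: strength of the l1 penalty
    model = Lasso(alpha=lam).fit(Phi, y)         # min_w sum_i f(y_i, phi(x_i), w) + lam*||w||_1

    kept = np.flatnonzero(model.coef_)           # the l1 penalty zeros most coefficients
    print(expand.get_feature_names_out()[kept])  # the sparse subset of phi(x) that survives

(Note that sklearn’s Lasso scales the squared-error term by 1/(2n), so alpha matches λ only up to that constant.)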
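
A second sketch shows why working only with dot products can suffice: for the homogeneous degree-2 polynomial kernel on R², K(x, z) = (xᵀz)² equals φ(x)ᵀφ(z) for an explicit three-dimensional expansion. This particular kernel is an assumed example, not necessarily the one used in the video.

    # Minimal sketch: the kernel identity K(x, z) = phi(x)^T phi(z) for the
    # (assumed) homogeneous degree-2 polynomial kernel on R^2.
    import numpy as np

    def phi(x):
        # Explicit expansion whose dot product reproduces (x^T z)^2:
        # phi(x) = [x1^2, sqrt(2)*x1*x2, x2^2]
        return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

    def K(x, z):
        # The kernel gives the same number without ever forming phi explicitly.
        return (x @ z) ** 2

    x = np.array([1.0, 2.0])
    z = np.array([3.0, -1.0])
    print(phi(x) @ phi(z))   # ~1.0 (up to float rounding)
    print(K(x, z))           # 1.0

A kernel machine (for example a kernelized SVM or kernel ridge regression) exploits exactly this: it trains in the expanded space while only ever evaluating K(x_i, x_j).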
