Python TensorFlow for Machine Learning - Neural Network Text Classification Tutorial
- Added 19 Jun 2024
- This course will give you an introduction to machine learning concepts and neural network implementation using Python and TensorFlow. Kylie Ying explains basic concepts, such as classification, regression, training/validation/test datasets, loss functions, neural networks, and model training. She then demonstrates how to implement a feedforward neural network to predict whether someone has diabetes, as well as two different neural net architectures to classify wine reviews.
✏️ Course created by Kylie Ying.
🎥 YouTube: / ycubed
🐦 Twitter: / kylieyying
📷 Instagram: / kylieyying
This course was made possible by a grant from Google's TensorFlow team.
⭐️ Resources ⭐️
💻 Datasets: drive.google.com/drive/folder...
💻 Feedforward NN colab notebook: colab.research.google.com/dri...
💻 Wine review colab notebook: colab.research.google.com/dri...
⭐️ Course Contents ⭐️
⌨️ (0:00:00) Introduction
⌨️ (0:00:34) Colab intro (importing wine dataset)
⌨️ (0:07:48) What is machine learning?
⌨️ (0:14:00) Features (inputs)
⌨️ (0:20:22) Outputs (predictions)
⌨️ (0:25:05) Anatomy of a dataset
⌨️ (0:30:22) Assessing performance
⌨️ (0:35:01) Neural nets
⌨️ (0:48:50) TensorFlow
⌨️ (0:50:45) Colab (feedforward network using diabetes dataset)
⌨️ (1:21:15) Recurrent neural networks
⌨️ (1:26:20) Colab (text classification networks using wine dataset)
--
🎉 Thanks to our Champion and Sponsor supporters:
👾 Raymond Odero
👾 Agustín Kussrow
👾 aldo ferretti
👾 Otis Morgan
👾 DeezMaster
--
Learn to code for free and get a developer job: www.freecodecamp.org
Read hundreds of articles on programming: freecodecamp.org/news
Thanks for watching everyone! I hope you enjoy learning from the examples in this course :)
What are the prerequisites for this video?
Excellent session! Thank you for covering every topic and showing practical implementation of LSTM.
Hi, I am very excited for this video, you are a very good teacher.
@@mfaiz6 My personal opinion but I would say you should have some level of knowledge of working with python. Be somewhat comfortable looping and iterating through data structures like dictionaries, lists, arrays, etc. and writing functions for basic tasks and printing/writing to console. You should also know and have basic usability of numpy arrays and pandas dataframes. From here, you can learn specific things you need by searching something you don't know via google or DDG as you need!
Damn, you're so cool.
This is exactly what I was searching yesterday! You're amazing! Thanks for this tutorial. :)
That was so well-explained and practical! Looking forward to more of these on other types of machine learning models! Thank you!
great content.
explained in layman terms without wasting time 👌🏻
Your way of explaining is so good. This was the first video I watched on neural networks and I'm already in love with it.
finally!! i have finally understood everything after a month of struggling to do so. thank you sooo much
20 minutes in and I am all in. I teach students ML and Data Science, and I keep studying the same myself. The young lady in the video covered all the necessary basics, and did it so well I might end up suggesting this video to my students on multiple occasions. And yeah, at the end of this video, I am going to her channel and subscribing. Keep up the good work
Really great video, great explanation of concepts in very easy/ layman terms. Well done!
Thanks so much Kylie, good coding tutorial and excellent, sharp run through ML theory!
Thanks again.
A reinforcement learning course please, please, please, I really need it, and you're so amazing at simplifying things and making them understandable
You are so awesome! This is what I was searching for! It really helps a lot! Thank you for all your hard work and precious time!
Thank you for making this! Please make it a series if you can
Thank you so much for your brilliant tutorials and courses Kylie (please do more!!!)! Could you please recommend some books on the mathematics of machine learning (and books that you found useful when you dived into the subject).
I find your tutorial very interesting, very clear, and very convincing. My question: Also, is there a tutorial that shows the practical application of the model you created? - I would like to learn more about how this model can be practically used for evaluating and analysing new data.
Your analogies are awesome, very easy to understand, thanks
Sharing your knowledge it is invaluable. Thank you 1000 times
A great one, I love your mode of teaching, simple
It's new to me, this Colab thing.
With it, I don't need to deal with Python environment questions any more!!
Amazingly good tool
Nice video, you really sparked my interest in ML, and I'm looking forward to future content! Keep it going!
Thank you once again Kylie!
Thanks a lot for this awesome video. It helped me a lot in my college project
you teach really well i am impressed seriously i mean it
@21:04 when kylie was explaining multiclass and binary classification with the example of hotdog, I first remembered Jian yang's app from Silicon Valley. I really liked that you put in a small clip of it.
Haha classic!!
This was a great video. My only questions from it would be:
1) How would you set these projects up outside of colab?
2) How do we utilize the model?
Thank you for the excellent overview!!!!
Thanks Kylie for explaining very clearly the concepts in different neural network architectures, the code part was also very interesting since I got to know for the first time about imbalanced learn library and about Dropout layer for dealing with overfitting! Besides, I guess we ran the model.evaluate before training the model to show the base case of randomly choosing between two labels yields accuracy of 0.5 (probability of random selection between two classes)?
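On that last point: yes, evaluating before any training shows the chance-level baseline. A minimal sketch of the idea (the data here is made up; it just mimics the diabetes dataset's shape of 8 features and 2 balanced classes):

```python
import numpy as np
import tensorflow as tf

# Made-up stand-in data: 8 features like the diabetes dataset, with the
# two classes perfectly balanced
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8)).astype("float32")
y = np.array([0, 1] * 100)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Evaluating BEFORE model.fit shows the untrained, chance-level baseline;
# with two balanced classes, accuracy lands in the neighbourhood of 0.5
loss, acc = model.evaluate(X, y, verbose=0)
```

Any accuracy the trained model reaches is then judged against that baseline, not against zero.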
Thanks Kylie!!! Awesome content.
Great, amazing and charming work, thank you.
Tutorials that go from start to finish from data to model *and* explain the surrounding concepts and theory.. those are good.
Maybe I should start including code too.. 🤔
very clearly explained
great job
You are great sister. You have helped me a lot with this tutorial. 😍
The Silicon Valley insertion was really cool.
Thank you so much, this video really made ML easier for me to understand than anything else I've tried on this topic.
I never worked on machine learning, but I can easily follow and understand what is going on. Thanks for the crystal clear and great explanation. @KylieYYing.
Thank you so much Kylie!
Great lesson, would love to see more of your work
Thanx @Kylie for such wonderful tutorials - how original and thorough, I really learned A LOT!
Anyway, I have a quick question: after completing evaluation with test cases, is it possible (like in other ML projects) to pass real-life data and get the answer?
Like, we build the model with 'description' and 'variety'; for a given 'description', can we predict the possible 'variety'?
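That's the usual pattern: push a new raw description through the exact same pipeline with `model.predict`. A hypothetical minimal sketch (the variety list, reviews, and layer sizes here are made up, not the notebook's actual values):

```python
import numpy as np
import tensorflow as tf

varieties = ["Pinot Noir", "Riesling"]  # made-up stand-in label list
reviews = ["bright cherry fruit with firm tannins",
           "crisp citrus, petrol notes and racy acidity"]

# Text goes in raw; vectorization is part of the model
vectorize = tf.keras.layers.TextVectorization(max_tokens=1000,
                                              output_sequence_length=16)
vectorize.adapt(reviews)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,), dtype="string"),
    vectorize,
    tf.keras.layers.Embedding(1000, 8),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(len(varieties), activation="softmax"),
])

# After model.fit(...) on the labelled dataset, a brand-new description
# is scored with the same predict call:
probs = model.predict(tf.constant([["ripe plum and smoky oak"]]), verbose=0)
predicted = varieties[int(np.argmax(probs[0]))]
```

The key point is that new data must pass through exactly the same preprocessing the training data saw.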
[04:39] Small nuance: strictly, `NaN` stands for "not a number" - the IEEE floating-point value returned by invalid numeric operations (e.g. zero divided by zero, or `4 * "dog"` in JS, which coerces and fails). It means you tried to do something with a number that's irrational to do with a number. That said, pandas does reuse float `NaN` as its missing-value marker, so reading it as "no value recorded here" is fair in the context of this dataset.
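On the Python side, the defining quirk of IEEE `NaN` is that it never compares equal to anything, including itself, which is why explicit checks like `math.isnan` (and `df.isna()` in pandas) exist. A small illustration:

```python
import math

nan = float("nan")

# NaN never equals anything, including itself -- that's why pandas/numpy
# provide explicit checks like math.isnan / np.isnan / df.isna()
self_equal = (nan == nan)      # False
detected = math.isnan(nan)     # True

# NaN propagates through arithmetic instead of raising an error
propagated = nan + 1.0
```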
Hi, I am very excited for your new amazing video, thanks , you are a very good teacher.
This is interesting to watch. Thank you!
you are awesome ! Very very clear explanation
it's learningggggg !!!! TENSORFLOW! 🔥🔥💕💕
Thank you very much for your tutorial!
Really awesome work!
I love these videos, keep making them.
You are amazing! Thank you very much.
I enjoyed your tutorial. Keep it UP girl, you ROCK 💪
I saw the thumbnail that was Kylie, so I gave it a Like already.
Code squad. Love it. 😊
Guys this is pure diamond 💎💎💎
Well explained. Thanks
Superb teaching!!!
Not hotdog :D - this part is still going round in my mind, and it's the funny bit that helped me grasp what binary classification is
Hope to see the next course about machine learning using Python and TensorFlow. And I want to ask: what are the daily-life applications of what this course covers? Thank you
Very informative thank you
You are a great teacher
This is a really good video. Watching it now.
Great course!
OMG Kylie is here wow new machine learning course 😍
Excellent tutorial. I have two questions: 1. Can I use open-source large language models in your text classification code for analyzing a wine review dataset? 2. If yes, please suggest where and how I can change it.
Just grateful, thank you.
Very good video, starting to practice with this; watched till 13:00
Oh man, was fasting today and the example at around 20:00 with the hot dog, pizza, and ice cream had me dying😅
Was saved by the Silicon Valley clip😂
Some conceptual errors present in the tutorial. Scaling the data before splitting means the train dataset is informed about data from the test set which it is not supposed to know. Random oversampling prior to the split might also overestimate the performance of the model on the test dataset because of data duplication/leakage. In general, it's best to keep the test data separate before augmenting the training data.
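To make the fix concrete, here's a minimal numpy sketch of leakage-free scaling: split first, fit the statistics on the training rows only, then reuse them on the test rows (synthetic data, not the course's dataset):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=5.0, scale=2.0, size=(100, 3))  # synthetic features

# 1) Split FIRST ...
train, test = data[:80], data[80:]

# 2) ... then fit the scaler statistics on the training rows only
mean, std = train.mean(axis=0), train.std(axis=0)

train_scaled = (train - mean) / std
test_scaled = (test - mean) / std  # reuse train statistics: no peeking at test
```

The same discipline applies to oversampling: resample only the training portion, never the test set.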
Great video!!
Great tutorial
Thank you for a well-crafted tutorial. My question is about what you did with the imbalanced dataset: creating artificial or synthetic data and using that as the basis for the ML model seems questionable, to say the least. It feels like we are introducing a lie into the model for the sake of an artificially equal outcome and using that for prediction. I would be grateful if you could elaborate on that, or anybody else for that matter.
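For what it's worth, random oversampling (what imbalanced-learn's RandomOverSampler does in the video) doesn't invent feature values: it duplicates existing minority-class rows so the loss isn't dominated by the majority class, and it should only ever be applied to the training split. A numpy sketch of the idea (toy data, not the actual library call):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced training set: 8 negatives, 2 positives
X = np.arange(10, dtype=float).reshape(10, 1)
y = np.array([0] * 8 + [1] * 2)

# Duplicate real minority rows (sampling with replacement) until balanced;
# no synthetic feature values are created
minority = np.flatnonzero(y == 1)
extra = rng.choice(minority, size=(y == 0).sum() - (y == 1).sum(),
                   replace=True)

X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
```

So the concern about "lying" applies less to the feature values themselves and more to where resampling happens: done on training data only, it just reweights what the model pays attention to.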
Thanks a million
awesome!
Love that intro 😂 😂
Is the wine review model also a feedforward neural net? Because it seemed like in the video you were alluding to it being an RNN?
Thanks kylie
Informative tutorial.
I am good, the tutorial was straightforward.
I want to be as smart as "Kylie Ying" when I grow up. LMAO! 🤣🤣🤣
Same. :)
This tutorial can be called "Neural networks crash course with practice problem". Thank you!
Love it
Sorry if this sounds rude, but what was the wine one for? Is it showing the accuracy of the reviews, whether a review is high or low rated?
Thank you
1:36:40 Is it wise to set trainable=True in the embedding layer imported from the hub? Isn't the whole point that it is pre-trained?
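The trade-off this question points at is fine-tuning vs. freezing: `trainable=False` keeps the pre-trained weights fixed, while `trainable=True` lets training adapt them to the wine reviews (often better accuracy, but a risk of overfitting on small data). A sketch using a plain Keras `Embedding` as a stand-in for the hub layer (sizes are hypothetical):

```python
import tensorflow as tf

# Stand-in for the pre-trained hub embedding (hypothetical sizes).
# trainable=False freezes the pre-trained weights; trainable=True would
# let training fine-tune them on the task at hand.
embedding = tf.keras.layers.Embedding(input_dim=1000, output_dim=16,
                                      trainable=False)

model = tf.keras.Sequential([
    embedding,
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.build(input_shape=(None, 20))
```

With the embedding frozen, only the Dense layer's kernel and bias remain trainable; flipping `trainable=True` would add the embedding matrix back into the gradient updates.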
Sorry I have a really dumb question: how did you share the colab notebook so that it's editable but modifications can't be saved?
Thanks for sharing. Could you make a TensorFlow 2 object detection tutorial on retraining with existing classes (labels) and adding a new class?
Amazing thanks :) glad to see a girl on your channel doing a tutorial for NLP !
Nice tutorial btw
At 1:12:25, feature scaling should be done after splitting into training & testing data in order to avoid information leakage
I think you could have used an « else » here :) 0:05
Great video !
We need Javascript TF tutorial as well. Thank you.
Hi, great tutorial but i think you have a mistake: you are leaking information from train to test. Both scaling and resampling must be done to the train and then to the test separately, not to the whole dataset 🙃
Hey @Kylie Ying, in the diabetes model you have 16 neurons in the first layer; would it be a better option to use 8, i.e. the length of the feature vector? Thanks.
Thank you. and Thank you.
I was expecting something like : tf.keras.layers.Input(shape=(8,))
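An explicit `Input` layer like that makes the shapes easy to check. Note that the first hidden layer's width (16 in the video) is a free hyperparameter: it does not have to equal the 8 input features, which are fixed by the data. A minimal sketch:

```python
import numpy as np
import tensorflow as tf

# 8 input features (fixed by the dataset); 16 hidden units (a free choice)
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Two dummy samples of 8 features each -> two sigmoid outputs
out = model.predict(np.zeros((2, 8), dtype="float32"), verbose=0)
```

Whether 8 or 16 hidden units works better is an empirical question; wider layers add capacity (8*16+16 weights in the first layer here) at the cost of more parameters to fit.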
Can we have custom plugin development in java using Eclipse tutorial from scratch .
Thanks in advance .
Great work, thanks, it's so simplified. Just WOW.
YEEAHHH KYLIE YING LADS AND GENTS!!
Thanks
thx 4 vid !~
if you have an error with the inputs shape when you evaluate the data just do this instead of what she did:
hub_layer = hub.KerasLayer(embedding, input_shape=[], dtype=tf.string, trainable=True)
YouTube wants me to see this video; I've seen it in my feed like 10 times already
Can I train an AI/ML model on sensitive data? Does TensorFlow store our data..?
The hotdog / not hotdog had me dying😅
1st example: When I tried this the first time I got almost the same accuracy, but when I restarted the kernel of the notebook and ran everything again I got an initial accuracy of 65% instead of 35%, and that accuracy varies between 60 and 70% in the next steps and finally drops to about 60% when evaluated on the test data (on multiple runs the best it got was 66%, but the average is much lower)...
Is the notebook saving the model and updating on re-run causing overfitting or is it normal?
I believe the code randomly creates your training, validation, and test sets so the percentages of accuracy will be different between models (when you restart the notebook) because the data points used for the different sets will be different.
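That's right, and pinning the shuffle to a fixed seed makes restarts reproducible. A small numpy sketch of a seeded 80/20 split (the notebook's own split code may differ):

```python
import numpy as np

def split_80_20(data, seed):
    """Shuffle with a fixed seed, then cut into 80% train / 20% test."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(data))
    cut = int(0.8 * len(data))
    return data[idx[:cut]], data[idx[cut:]]

data = np.arange(100)
train_a, test_a = split_80_20(data, seed=42)
train_b, test_b = split_80_20(data, seed=42)  # same seed -> identical split
```

Without a fixed seed, every kernel restart draws a different train/validation/test partition, so some run-to-run accuracy variation is normal rather than a sign of a saved, overfitting model.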
Thank you so much for this amazing content, can you make another to Federated learning
This would be awesome 🤩
Can I use text classification to classify my users' inputs and map them to nearly 10,000 products, to automate the pricing of user entries instantly without needing a sales team?
Thx sir for educating us 😊