End-To-End: No Code - AutoML in Vertex AI for ML Operations [notebook 02a]
- added 25 Jul 2024
- An end-to-end workflow completely within the Vertex AI interface in the Google Cloud Console. We use AutoML to train a machine learning model, walking through all the steps: connecting to data sources, training a model, evaluating the final model, deploying it to an online endpoint, and requesting predictions from multiple clients. A few deep dives along the way! This video follows the notebook 02a - Vertex AI - AutoML in GCP Console (no code).
GitHub Repository: github.com/statmike/vertex-ai...
The Notebook followed in this video: github.com/statmike/vertex-ai...
Timeline:
0:00 - Introduction
3:25 - Overview
5:08 - Start Walkthrough
6:38 - Create Dataset
7:40 - Discuss Types of Models for AutoML
11:12 - Select Source of Data
12:50 - Train Model with AutoML
16:35 - Q&A: What optimization objective to choose?
23:08 - Set Budget and run training
24:22 - Evaluate Model
30:43 - Deploy & Test (Endpoint)
36:32 - Review Endpoint
39:50 - Online Predictions
47:42 - Online Model Explanations
48:33 - Q&A: What are explanations?
51:35 - Review Explanations for Model
53:23 - Batch Predictions
56:56 - Q&A: When should I use the Console for AutoML?
58:43 - Wrap-up
Thank you! Finally, someone who explains the why to each concept. Thank you so much for supplying what I was missing to make the mental connections!
Great video Mike. Nice explanation of the performance metrics also. End-to-end walkthrough of AutoML.
This was real quality. Thanks a lot. You did a superb job of explaining and hit the perfect balance of not too much and not too little.
Watched all the videos on the channel, very good Mike. I am so thankful to you.
I appreciate that! There should be more videos this Fall!
already moved to MLOps engineer role, the github repo and videos help me a lot. Thanks mike
This is a great resource, looking forwards to checking out your other videos. Thanks so much for putting these up.
Thank you Mike for the good overview!
Thank you for the playlist. Amazing quality. Just wanted to say that you are amazing and your content is appreciated.
Perfect way to onboard someone to Vertex AI, still. Thank you for the great work!
Thanks Mike for such great content. I’ve learnt a whole lot from this. Cheers!
Great detailed tutorial. Thanks a lot Mike. Please, keep up the good work!
Thank you for the encouragement!!!
Thanks Mike, I wish this tutorial was around when I first started trying to find my way around Vertex AI. Whilst I enjoy finding out things by trial and error, the older I get the more I value my time, and having someone to show me the first steps and explain what to watch out for is invaluable.
Thank you for acknowledging this. This is one of the reasons I decide to embark on this adventure!
@@statmike-channel Yes thanks a lot !
wonderful tutorials, thank you Mike
Thank you for making these videos, please continue to make more
Coming this Fall! Thank you for the positive feedback!
this is awesome and you're amazing Mike!
Perfect tutorial to understand GCP ML
Thank You! This motivates me to make more videos soon!
Love the way he introduced himself!!
Thank you. I really appreciate your effort. 😊
Awesome tutorial, thanks very much
Awesome. Thank you Mike :)
this is amazing. thank you mike!!!
😍
Thanks for the Video great stuff !!
bravo...
The content is brilliant and the method is effective. A larger "font_size" would be more convenient, though.
I liked the idea of end to end
Thank you. Great video
Thanks man, this helped a lot.
Yo! Love your stuff!
good job pal!
Great stuff! Enjoyed the excellent explanations and following along with the GitHub repo clone. I have a few questions. 1) Since I changed a few lines in the notebooks, what happens if I commit changes: is git going to try updating your repo? 2) I would like to see the underlying model architecture: how do I see this? Perhaps answers are coming in the following videos.
Hi Mike! When are you filming a new no-code tutorial on the new AutoML Tabular training on Vertex AI Pipelines? Great content btw!
Don't get disheartened with the traffic you're getting or the class imbalance of views vs. likes 😅; trust me, many don't even know this playlist exists, plus many don't even know what MLOps is.
I've been looking for it for the last 2 months and don't know why the YouTube algorithm didn't put this up for me before. Would suggest adding hashtags maybe.
Cheers 🤘👊🙌
Thanks for the tips! Working on a plan for promotion as the next series of video launches.
Great tutorial. Thanks for sharing. I have a question about how to create a batch prediction for the images. In your case you used the tabular data (where you used the BQ), but when I'm trying to make a batch prediction for images, it is asking me to provide JSONL format file. I'm a little confused providing JSONL file. Thank you.
Hi Farukh, I look forward to expanding these demos to also show text- and image-based ML. Until then I will try my best to answer you here in the YouTube comments. For an AutoML Vision model through the console (similar to this video), the workflow asks for a name, the model, a source path, and a destination path. The paths point to Google Cloud Storage buckets. The source is a .jsonl file in the storage bucket that has one line per image you want to predict on. Each of these lines looks like {"content": "gs://path to image", "mimeType": "image/jpeg"}. Here is a link that may help: cloud.google.com/vertex-ai/docs/predictions/batch-predictions#batch_request_input
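The JSONL manifest described above can be generated with a few lines of Python. This is a minimal sketch; the bucket and image paths below are placeholders, not paths from the video:

```python
import json

# Hypothetical image URIs in a Cloud Storage bucket (placeholders).
image_uris = [
    "gs://my-bucket/images/img_001.jpg",
    "gs://my-bucket/images/img_002.jpg",
]

# One line per image, matching the {"content": ..., "mimeType": ...}
# shape that the batch prediction input format expects.
with open("batch_input.jsonl", "w") as f:
    for uri in image_uris:
        f.write(json.dumps({"content": uri, "mimeType": "image/jpeg"}) + "\n")
```

You would then upload `batch_input.jsonl` to a Cloud Storage bucket and point the batch prediction job's source path at it.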
Thank you, just wanted to check: after deploying the endpoint in Vertex, there is no need to go through extra steps to have a REST API endpoint, right? For AWS SageMaker, there is an extra API Gateway with Lambda step.
Thank you
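On the REST question above: a deployed Vertex AI endpoint is directly callable over HTTPS with an OAuth bearer token, with no separate gateway layer. A sketch of the request shape, with placeholder project/region/endpoint IDs and a made-up instance payload:

```python
import json

# Placeholder identifiers -- substitute your own.
PROJECT = "my-project"
REGION = "us-central1"
ENDPOINT_ID = "1234567890"

# The predict URL for a deployed endpoint follows this pattern:
url = (
    f"https://{REGION}-aiplatform.googleapis.com/v1/"
    f"projects/{PROJECT}/locations/{REGION}/endpoints/{ENDPOINT_ID}:predict"
)

# Request body: a list of instances shaped like the model's input features
# (feature names here are hypothetical).
payload = json.dumps({"instances": [{"feature_a": 1.0, "feature_b": "x"}]})

# An actual call adds an OAuth token, e.g. with the requests library:
#   requests.post(url, data=payload, headers={
#       "Authorization": f"Bearer {token}",
#       "Content-Type": "application/json"})
```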
24:10 - I thought that would be the whole video. Waiting one hour for the model to finish training.
Mike you are F******* amazing
very flattering!! Not sure I can live up to this level of hype!
great 🎉
Using AutoML, do we know what models it is using behind the scenes?
I wish you could make a video on creating pipelines using Vertex AI Pipelines.
There are already a few videos in this series that incorporate some basic pipelines. The repository linked in the description also has more examples of advanced pipelines. Let me know if there is any specific part of pipelines you would like to see featured!
I got an error during training :( the training job failed
I may have an astonishing project that does the impossible... I need an ML setup to take it to the next level. It took me forever to code this one-of-a-kind tool, but I really need an ML model to analyze it. I have not been able to get AI to work on it properly, probably because no one has EVER seen a data set like this. I really need someone to help with this. FULL NDA
Mike, thanks for the video. Just a question: for batch prediction, do we need to do "Deploy & Test" first and then do batch predictions?
Hello Priya, you can skip the deployment to an endpoint and go straight to batch predictions. The endpoint is not required for batch predictions, which run as a separate job. Hope this helps!
@@statmike-channel Thanks Mike for the quick reply!
@@statmike-channel wow, it does make sense now!
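The endpoint-free batch prediction described in the answer above can be sketched with the google-cloud-aiplatform Python SDK. Every name below (project, region, model ID, BigQuery tables) is a placeholder:

```python
# A batch prediction runs as its own job directly against the model
# resource -- the model does NOT need to be deployed to an endpoint first.

def batch_predict_kwargs(model_id: str, source_table: str, dest_prefix: str) -> dict:
    """Assemble keyword arguments for aiplatform.Model.batch_predict
    when reading from and writing to BigQuery (a sketch, with placeholder
    naming conventions)."""
    return {
        "job_display_name": f"batch-{model_id}",
        "bigquery_source": f"bq://{source_table}",
        "bigquery_destination_prefix": f"bq://{dest_prefix}",
    }

# Usage (requires GCP credentials, so it is commented out here):
# from google.cloud import aiplatform
# aiplatform.init(project="my-project", location="us-central1")
# model = aiplatform.Model("projects/my-project/locations/us-central1/models/123")
# job = model.batch_predict(
#     **batch_predict_kwargs("123", "my-project.ds.source", "my-project.ds"))
```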
What if the raw data has to go through preprocessing steps before being fed to the model?
Hi Tariq, any time you train a model, custom or with the AutoML service, you are taking a snapshot in time of the inputs (the training data). If you preprocess those inputs, then those preprocessing steps need to be replicated when using the resulting model for predictions. If you choose a method with built-in preprocessing, AutoML or a custom model with built-in preprocessing, then the preprocessing steps are part of the model and are replicated automatically during prediction. A good guide for reviewing the automatic preprocessing done by Vertex AI AutoML can be found at this link: cloud.google.com/vertex-ai/docs/datasets/data-types-tabular
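A toy illustration of the point above, using simple standardization: statistics fit on the training data must be stored and re-applied to every new row at prediction time, unless the training method (e.g. AutoML) bakes the preprocessing into the model for you.

```python
def fit_scaler(values):
    """Learn mean and standard deviation from the training data."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, var ** 0.5

def transform(value, mean, std):
    """Apply the SAME training-time statistics at prediction time."""
    return (value - mean) / std

train = [10.0, 20.0, 30.0]
mean, std = fit_scaler(train)            # fit once, at training time
scaled_new = transform(25.0, mean, std)  # reuse for every new prediction row
```

Forgetting the `transform` step at serving time (training/serving skew) is one of the most common causes of silently bad predictions.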
How do you actually get the model? Like: What architecture does it use? What are the weights?
Thank you for the question. I have this notebook in the accompanying GitHub repo that shows a method for extracting the model type and hyperparameters from Cloud Logging: github.com/statmike/vertex-ai-mlops/blob/main/02%20-%20Vertex%20AI%20AutoML/02Tools%20-%20AutoML%20Cloud%20Logging.ipynb
It is also now possible to have more control over AutoML by using AutoML Workflows. I will add this on my next update pass for the AutoML series. A good documentation page to read about this is: cloud.google.com/vertex-ai/docs/tabular-data/tabular-workflows/e2e-automl
Hello friend, I am interested in investing/trading in the stock market. Can you make another video on how to train the platform for trading? Thanks
LOL Look at the tab he has open! omg cancel this guy.