Deploying Cloud Function using Google Cloud Build
- Published 19 Jul 2024
- This video helps you learn Google Cloud Build and how to deploy a Cloud Function with it.
GitHub: github.com/cloudadvocate/goog...
Important Link: / awesome-cloud-build-%e...
Content:
00:00 Intro
00:22 Scenario/Use-case
02:02 What is Cloud Build?
02:22 Demo
17:18 Demo with trigger from GitHub
23:35 Why not Jenkins?
----------
- Associate Cloud Engineer Study Guide: www.cloudadvocate.net/p/assoc...
- Google Cloud Playlist: • Learn Google Cloud
----------
- Questions? Thoughts? Disagreements? Tell us here in the comments.
----------
- PLEASE SUBSCRIBE! / @cloudadvocate
Oh and follow me!
Facebook: / multicloudguy
Twitter: / multicloudguy
Thanks. It was so detailed and clear that I could finally get the whole picture, which I couldn't after studying the Google Cloud Build documentation!
Awesome! Very well explained 👍 👍
Thanks!
Thanks. Still useful, but Google now has gen 2 functions. Is there a way to set up a gen 2 function using CI/CD?
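For readers with the same question: gen 2 functions can be deployed from a Cloud Build step with the `--gen2` flag on `gcloud functions deploy`. A minimal sketch (the function name, runtime, and region below are illustrative, not from the video):

```yaml
# cloudbuild.yaml — deploy a gen 2 Cloud Function (illustrative names)
steps:
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: gcloud
    args:
      - functions
      - deploy
      - hello_world            # hypothetical function name
      - --gen2
      - --runtime=python311
      - --region=us-central1
      - --trigger-http
      - --source=.
```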
Thank you for the amazing content. Could you please let us know how to integrate Jenkins with App Engine? For GKE we have a plugin, and we integrate by creating a JSON key for the service account, but for App Engine there is no plugin. Could you please suggest something? Thank you!
Thanks for this. But I see the first deployed Cloud Function is named "sample_function" and the one deployed via the Cloud Build trigger is "hello_world", which was a completely new deployment. I believe if we keep the function name as "sample_function" in the YAML, it will redeploy that specific function as part of CI/CD. Please correct me if I'm wrong.
It seems like the --allow-unauthenticated parameter isn't working properly, since the "Allow unauthenticated" flag is not shown in the main Functions menu in the GCP Console. How can I expose the API and give public access? @CLOUD ADVOCATE
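If the function still isn't publicly callable after deploy, invoker access can usually be granted explicitly with an IAM binding. A sketch, assuming a gen 1 HTTP function (the function name and region are illustrative):

```shell
# Grant unauthenticated (public) invoke access to a deployed function.
# "sample_function" and "us-central1" are illustrative values.
gcloud functions add-iam-policy-binding sample_function \
  --region=us-central1 \
  --member=allUsers \
  --role=roles/cloudfunctions.invoker
```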
Can you tell us what to write in the YAML files for the config schema for Cloud Build and Cloud Functions?
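For anyone wondering the same thing: a Cloud Build config is just a list of steps, each running one builder image. A minimal sketch that deploys an HTTP-triggered function, matching the flow shown in the video (the function name and runtime are illustrative):

```yaml
# cloudbuild.yaml — minimal sketch; names and runtime are illustrative
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - sample_function
      - --runtime=python39
      - --trigger-http
      - --allow-unauthenticated
timeout: 600s
```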
Thanks for the great video! This really helped me with getting started. I have a question though about the Cloud Build trigger. It gives three options for events that invoke a trigger "Push to a branch", "Push new tag", and "Pull request". Now, let's say that I have a repository with many Cloud Functions. I make an edit on one function and I trigger the event. But I don't want every single cloud function in that repository to re-deploy. Any ideas?
Very good question. Yes, you can accomplish this in multiple ways, such as ignoring files so they don't trigger any builds. For example: you can invoke a build only if a certain file is changed. cloud.google.com/cloud-build/docs/automating-builds/create-manage-triggers has very good info on that. Please try this in your non-prod :). You can also combine this with a Git branching strategy. I found this link helpful too: stackoverflow.com/questions/61011558/in-gcp-through-cloudbuild-how-can-i-ensure-that-only-those-steps-get-triggered
Good question indeed. What if you have all Cloud Functions in main.py? 🤔 Then technically every single change will trigger a rebuild of all functions.
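One concrete way to scope a trigger, assuming each function lives in its own subdirectory (all repo names and paths below are illustrative): set an included-files glob on the trigger so it only fires when that directory changes.

```shell
# Create a GitHub trigger that fires only when files under functions/hello/ change.
# Repo name, owner, and paths are illustrative.
gcloud beta builds triggers create github \
  --repo-name=my-repo \
  --repo-owner=my-org \
  --branch-pattern='^main$' \
  --build-config=functions/hello/cloudbuild.yaml \
  --included-files='functions/hello/**'
```

With one trigger per function directory, a push touching only `functions/hello/` rebuilds only that function.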
Amazing video... just saw the video..now I'll try once 🙏
Pls provide GitHub url
Thanks Pratik, updated in the description.
I am facing an issue while running the gcloud builds command.
What's the error, mate?
Excellent video. I have a doubt.
Is there any way we can roll back to a previous state? Currently I am using Terraform to create the Cloud Function and trigger the job with TeamCity, and the flow is working fine. But each time I have to destroy the existing infra and create a new Terraform plan. Is there any way in Cloud Build to store the state of the Cloud Function and roll back to the previous state if our function is not working properly?
Looking forward to more videos.
Hi Mayur, that's a good scenario. Let me think about that. One way is maintaining the versioning in your Git, then creating a new function and routing the traffic to it, pretty much like blue-green. But other than that, I don't think Cloud Functions supports any versioning. Let me see if there is any other way.
Nice; I made one project that has more than one function and followed the steps. But when the Cloud Build trigger fires, the build deploys all the functions again. How do I create just one project with more than one Cloud Function that deploys only the functions I have changed?
How did you upload all the files to the Cloud Shell directory? This question may be silly.
Hi @Cloud Advocate, I am not able to get the free tier for GCP... could you please help me?
How can we configure Cloud Build using a Terraform module?
How is this bucket created?
What will be the location of this bucket?
Hi GK, thanks for the videos on cloud. Could you please suggest whether I should start my cloud career with Cloud Developer Associate or Azure Architect? Thanks!!
You mean in Azure? Or GCP? I am not much familiar with Azure at the moment.
Can we do a rollback using Cloud Build?
Very informative videos
I am facing one issue: (gcloud.builds.submit) HTTPError 403: Access denied. cloudbuildfn@awesome-pilot-289111.iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.
Even though the service account has the "Storage Admin" role.
Hi Varun, did you check permissions of cloud build?
@@CloudAdvocate The service account has the following roles: Cloud Build Service Account, Cloud Functions Developer, and Service Account User.
One more interesting thing: the command line still shows "cloudbuildfn@awesome-pilot-289111.iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object," but a build was triggered for about 1 min, though it is failing with the following error:
Function failed on loading user code. Error message: File main.py that is expected to define function doesn't exist
Detailed stack trace:
Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v1.py", line 315, in check_or_load_user_function
_function_handler.load_user_function()
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker_v1.py", line 190, in load_user_function
spec.loader.exec_module(main_module)
File "<frozen importlib._bootstrap_external>", line 724, in exec_module
File "<frozen importlib._bootstrap_external>", line 859, in get_code
File "<frozen importlib._bootstrap_external>", line 916, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/user_code/main.py'
After copying all the source into the same folder and executing gcloud builds submit, the Cloud Function is getting deployed and working fine, but on the command prompt I still see the same error "cloudbuildfn@awesome-pilot-289111.iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object", which is kind of strange.
Kindly upload Jenkins-with-GCP developer tutorial videos.
Sure, will do.
Thanks for the video bro, it is very helpful. How can we get an email when the build fails?
Hi Manoj... thank you! It's not as straightforward as in Jenkins, but you can do it with Pub/Sub. Give it a try: cloud.google.com/cloud-build/docs/subscribe-build-notifications
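The cloud-builds Pub/Sub topic carries one message per build state change, with the build's JSON in the message body. A minimal sketch of the filtering side, assuming a Pub/Sub-triggered function (the actual email/notify step is left as a stub, since the source doesn't specify one):

```python
import base64
import json

# Build statuses that should trigger a notification
# (values from Cloud Build's build "status" field).
FAILURE_STATES = {"FAILURE", "TIMEOUT", "INTERNAL_ERROR"}

def should_notify(build: dict) -> bool:
    """Return True if this build message represents a failed build."""
    return build.get("status") in FAILURE_STATES

def on_build_event(event, context=None):
    """Pub/Sub-triggered entry point: decode the message and filter."""
    build = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    if should_notify(build):
        # Hypothetical notify step: wire this to SendGrid, SMTP, etc.
        print(f"Build {build.get('id')} failed with status {build.get('status')}")
```

Subscribing a function like this to the cloud-builds topic gives Jenkins-style failure alerts without polling.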