How to Schedule Cloud Function Triggers using Cloud Scheduler?

 
Cloud Functions 

Cloud Functions is an event-driven, fully managed compute platform provided by Google Cloud Platform. Event-driven means a function runs only when an event it is watching fires, and you are billed only while your function is running; if your function is idle, the cost is zero. It is easy to set up: write your logic in any of the supported languages and trigger it via HTTP requests, file uploads to Cloud Storage, Pub/Sub events, or Firebase. It offers autoscaling and built-in security, and because it is a fully managed service you also get all the system usage metrics and logs at no extra cost.
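As a minimal sketch of what an HTTP-triggered function looks like (the name `hello_http` is illustrative; in the Python runtime the entry point receives a Flask request object and the return value becomes the HTTP response body):

```python
def hello_http(request):
    # `request` is a flask.Request when running on Cloud Functions.
    # Guarded here so the function can also be exercised locally with None.
    name = "World"
    if request is not None and getattr(request, "args", None):
        name = request.args.get("name", "World")
    return f"Hello, {name}!"
```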

Cloud Scheduler 

Cloud Scheduler is a fully managed cron job scheduler provided by Google Cloud Platform. It lets you schedule virtually any job: batch processing, big data jobs, cloud infrastructure operations, and more. To reduce manual intervention on failure, you can configure retry behaviour, including maximum retry attempts, maximum retry duration, and the minimum and maximum backoff duration (the time to wait before retrying a job after it fails). It supports several targets, including App Engine, Cloud Pub/Sub, and HTTP endpoints, which allows jobs to trigger Compute Engine, Google Kubernetes Engine, and even on-premises resources.
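As an illustrative sketch, a Scheduler job with an HTTP target and a retry policy can be described like this (field names follow the style of the Cloud Scheduler API's Job resource; the project, region, URL, and values are placeholders, not a working configuration):

```python
# Hypothetical job body for an hourly HTTP-target job with retries.
job = {
    "name": "projects/my-project/locations/us-central1/jobs/hourly-job",
    "schedule": "0 * * * *",        # every hour, on the hour
    "time_zone": "Etc/UTC",
    "http_target": {
        "uri": "https://example.com/endpoint",  # placeholder target URL
        "http_method": "GET",
    },
    "retry_config": {
        "retry_count": 3,               # maximum retry attempts
        "max_retry_duration": "600s",   # give up after 10 minutes
        "min_backoff_duration": "5s",   # first wait after a failure
        "max_backoff_duration": "60s",  # backoff is capped here
    },
}
```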

In this blog, let's deploy a Python function on Cloud Functions that copies a blob from a Cloud Storage bucket to another blob whose name carries a timestamp, and trigger the function every hour using Cloud Scheduler. The Python code I am using is given below.

rename.py
 
from google.cloud import storage
import datetime

def rename_blob(request):
    # HTTP-triggered entry point: Cloud Functions passes a Flask request object.
    time = datetime.datetime.now().strftime("%H%M%S")
    bucket_name = "aksshaay"
    blob_name = "aksshaay.txt"
    destination_bucket_name = "aksshaay"
    destination_blob_name = "aksshaay" + time

    storage_client = storage.Client()

    source_bucket = storage_client.bucket(bucket_name)
    source_blob = source_bucket.blob(blob_name)
    destination_bucket = storage_client.bucket(destination_bucket_name)

    # Copy the source blob into the destination bucket under the timestamped name.
    blob_copy = source_bucket.copy_blob(
        source_blob, destination_bucket, destination_blob_name
    )

    return (
        "Blob {} in bucket {} copied to blob {} in bucket {}.".format(
            source_blob.name,
            source_bucket.name,
            blob_copy.name,
            destination_bucket.name,
        )
    )

And the requirements.txt will have the following:

google-cloud-storage==2.5.0
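Before deploying, the timestamp-suffix naming used above can be sanity-checked locally with no GCP credentials (the fixed datetime here is just for illustration):

```python
import datetime

# Same naming scheme as rename.py, applied to a fixed time for illustration.
time = datetime.datetime(2024, 1, 2, 13, 45, 9).strftime("%H%M%S")
destination_blob_name = "aksshaay" + time
print(destination_blob_name)  # -> aksshaay134509
```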

 

On your GCP console, search for the Cloud Functions API and enable it. Then go to Cloud Functions to create a new function, and select the product generation and region (a comparison of the generations is given below).

 

GCP offers two product versions (1st and 2nd generation) of the Cloud Functions service; a basic comparison between the two is shown here.

 

Select the authentication for the HTTPS trigger; you can also select an event trigger instead.

 

Then configure Runtime, Build, and Connections as required.

 

Then select the runtime you want to use, choose how you want to provide the source code, and give the entry point (exported function), i.e., the function that should be called when the trigger fires. To structure your code, follow the format shown by Cloud Functions when you select a runtime, then deploy it.

 

Once it is deployed successfully, you can use the HTTPS link to invoke the function. As mentioned, you get all the metrics without installing any agent or paying extra; you only pay for what you use.
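For example, once you have the trigger URL you can call it from Python. The URL below is a placeholder following the typical 1st-gen pattern (`https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME`); substitute the one copied from your console:

```python
import urllib.request

# Placeholder trigger URL copied from the Cloud Functions console.
url = "https://us-central1-my-project.cloudfunctions.net/rename_blob"
req = urllib.request.Request(url, method="GET")

# Uncomment to actually invoke the function (requires the deployed URL,
# and an identity token if the function does not allow unauthenticated calls):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```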

 

Configure Cloud Scheduler to Trigger Cloud Function on a Specific Time Interval

Go to Cloud Scheduler in your cloud console to create a cron job: give it a name, select the region and time zone, and configure the job frequency. If you are new to cron jobs, refer to this blog to learn about cron jobs and setting a cron expression.
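For reference, the every-hour schedule used in this post is the standard five-field cron expression below (minute, hour, day of month, month, day of week):

```python
# "0 * * * *" fires at minute 0 of every hour.
schedule = "0 * * * *"
minute, hour, day_of_month, month, day_of_week = schedule.split()
print(minute, hour)  # -> 0 *
```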

Then continue to configure the execution: set the target type to HTTP, give the URL copied from Cloud Functions, and set the HTTP method.

You can also set a retry configuration (when a job does not complete successfully, it is retried according to the retry configuration you set): maximum retries, max retry duration, and the maximum and minimum backoff.
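To make the backoff settings concrete, here is a small sketch of how a doubling backoff bounded by a minimum and maximum plays out. The doubling behaviour and parameter names here are illustrative, not the exact Scheduler internals:

```python
def backoff_waits(min_backoff, max_backoff, retries, multiplier=2.0):
    """Return the wait (seconds) before each retry: start at min_backoff,
    multiply after every attempt, never exceed max_backoff."""
    waits, wait = [], float(min_backoff)
    for _ in range(retries):
        waits.append(min(wait, float(max_backoff)))
        wait *= multiplier
    return waits

print(backoff_waits(5, 60, 5))  # -> [5.0, 10.0, 20.0, 40.0, 60.0]
```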

 

Please contact our technical team for any offshore infrastructure management services: website | LinkedIn.

