As you saw earlier in this course, there are two general patterns for ETL workflows. Event-triggered, aka push, as in "I push a new file to GCS and then my workflow kicks off," or pull, where Airflow at a set time looks in your GCS folder and takes whatever contents it finds there for its scheduled workflow run. We can use Cloud Functions to create our event-driven or push architecture. I mentioned triggering on events within a GCS bucket, but you can also trigger functions based on HTTP requests, Pub/Sub topics, Firestore, Firebase, and more, as you see here. Generally, push technology is great when you want to distribute transactions as they happen. Things like stock tickers and other financial transactions are classic cases where push matters. How about disaster alerts or other notifications? Again, push is important. For ML workflows where your upstream data doesn't arrive at a regular pace, unlike a batch such as "get all the transactions at the end of each day," consider experimenting with a push architecture. Your final lab, since it's based on regular Google Analytics news article data, will be a pull architecture, but I've added an optional lab for you to get practice with Cloud Functions and event-driven workflows if you're interested.

So, let's talk through some of those pieces now. For example, let's assume we have a CSV file or a set of files loaded to GCS, so we'll choose a Cloud Storage trigger for our function. Then, we specify an event type; here it's finalize/create, which fires when new files are written. Then, a bucket to watch.

As part of a Cloud Function, we need to create the actual function code, which is written in JavaScript, and that's what gets called. The good news is that most of this code for automatically triggering Airflow DAGs from a function is boilerplate for you to copy as a starting point (there's a sketch of it after this walkthrough). Here, we specify a name for our function, called triggerDag. Then, we tell it where the Airflow environment to be triggered is, and which DAG in that Airflow environment. In this case, it's looking for a DAG called GcsToBigQueryTriggered. Keep in mind, you could have multiple workflows, or multiple DAGs, in a single Airflow environment, so be sure you trigger the correct DAG name. Then, we have a few provided constants that construct the Airflow URL we're going to send a POST request to, as well as who's making the request and what the body of the request is. Lastly, the triggerDag function makes the actual request against the Airflow server to kick off your workflow DAG run.

Once you've got the Cloud Function code ready inside your index.js file and the metadata about that function inside package.json, which, by the way, just contains your code dependency and versioning information, you still need to specify in your Cloud Function which function you actually want executed. In this case, we created one called triggerDag, so you just copy that down. I'll save you about 20 minutes of frustration and tell you that the function-to-execute box is case sensitive, so a name ending in capital D-A-G is a different function than one ending in capital D, lowercase a, lowercase g.

There are several advanced options that you can specify for your Cloud Function, like adjusting the default timeout for the function trigger, or whether the function should automatically retry on failure. For the next lab, we'll just leave these at the default values. Generally, you'd only want to turn on retry on failure if you have short-lived or transient errors in your function code, like if your function depends on a less-than-stable outside system.
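To make that walkthrough concrete, here's a minimal sketch of what an index.js along these lines could look like. It's illustrative only: it assumes Airflow 1.x's experimental REST API and a node-fetch dependency, the AIRFLOW_URL value is a placeholder, and the lab's actual boilerplate also handles authentication against the Composer web server (Identity-Aware Proxy), which is omitted here for brevity.

```javascript
// index.js -- illustrative sketch; the lab provides the full boilerplate,
// including IAP authentication, which is omitted here.
const fetch = require('node-fetch'); // declared as a dependency in package.json

// Constants that construct the Airflow web server URL and name the DAG to run.
const AIRFLOW_URL = 'https://your-airflow-webserver.example.com'; // placeholder
const DAG_NAME = 'GcsToBigQueryTriggered'; // must match a DAG in your environment

/**
 * Background Cloud Function, fired when a new object is finalized in the
 * watched GCS bucket. It POSTs to the Airflow REST endpoint to kick off a
 * run of the DAG named above.
 */
exports.triggerDag = async (event, context) => {
  // Airflow 1.x experimental API path; newer Airflow versions use a stable API.
  const endpoint = `${AIRFLOW_URL}/api/experimental/dags/${DAG_NAME}/dag_runs`;

  // The body of the request passes the uploaded file's details to the DAG run.
  const body = {
    conf: { bucket: event.bucket, name: event.name },
  };

  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });

  if (!response.ok) {
    // Throwing surfaces the failure in Cloud Functions logs (and allows a
    // retry if you turned that option on).
    throw new Error(`Failed to trigger DAG ${DAG_NAME}: HTTP ${response.status}`);
  }
  console.log(`Triggered DAG ${DAG_NAME} for file ${event.name}`);
};
```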
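And since package.json came up, here's roughly what it would contain for a function like this, just dependency and versioning information. The name and versions are illustrative.

```json
{
  "name": "trigger-dag-function",
  "version": "1.0.0",
  "description": "Triggers an Airflow DAG when a file lands in GCS",
  "dependencies": {
    "node-fetch": "^2.6.0"
  }
}
```

Remember, neither file tells Cloud Functions which exported function to run; that's the function-to-execute box, where you'd type triggerDag exactly as it appears in index.js.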
There you have it: your Cloud Function has been created, and it's actively watching your GCS bucket for file uploads. But how can you be sure that everything's working as intended? For that, check out our next topic on monitoring and logging.