How to send Datama notifications by webhook via an orchestrator (GA4 BQ GCP example)?

Datama can be used to send messages containing analyses or alerts, but sometimes the data is not available at a fixed time. This is the case, for example, with GA4 intraday data in BigQuery. 

In such cases, Datama enables notifications to be sent from an external tool via a webhook system. The external tool, typically an orchestrator (in Google Cloud Platform this is Workflows, but the same operations can be carried out in tools such as Apache Airflow, AWS Step Functions or Azure Logic Apps), calls the webhook, which triggers the analysis and the dispatch of the message at exactly the right moment: when the data is ready.

In this example, we’ll walk through the procedure for setting up exports in Datama based on the “webhook” method (an alternative to the “scheduled” option). The aim is to trigger the export after an event in the data warehouse (e.g. a table update or creation). This is particularly useful when waiting for GA4 data updates that don’t arrive at a fixed time. If you’re not familiar with the problem, I invite you to read this article on the subject.

In our example, we’ll follow the procedure for triggering the query once the GA4 data has been moved from the event_intraday table to the event table.

A. Create your workbook in Datama

A prerequisite is to have a working workbook in Datama that you would like to trigger using a webhook. In our case, this will typically be a Datama use case sourced on the BigQuery GA4 tables that we want to make sure are up to date. 

Once created, you can then create an export, and then get a webhook.

B. Define the workflow

Before you can automate the sending of Datama exports at the right time, it’s essential to set up a workflow. This will be activated by a trigger: as soon as a specific event occurs in your data warehouse, it will automatically call the Datama API to execute the export. In our case, we’ll base the trigger on the deletion of the event_intraday table, ensuring that data is only sent once it has been fully loaded into the daily event table. Follow these steps to set up your webhook for BigQuery.
To do this, go to the Google Cloud Console.

Step 1: Open the Workflows menu

  • In the navigation menu (top left), go to Workflows > Workflows, or search for “Workflows” in the search bar.

Step 2: Create a new workflow

  • Select the “CREATE” button to create a new workflow
  • Define its region, which must correspond to the region of the BigQuery project you wish to work on.
  • In “Call Log Level”, select “All calls”; calls will be filtered later in the “Trigger” section.
  • You can add labels to your webhook, for example with a Type=“webhook-job” label and an App=“datama” label.

Step 3: Define the workflow trigger

  • In the Triggers section at the bottom of the page, select “Add New Trigger” > “Eventarc”.
  • Give it a name and select “BigQuery” as Event Provider
  • For the event type, we’ll rely on the deletion of the event_intraday table. In its daily sequence, BigQuery creates a new partitioned table for the new date, loads the data into it, and then deletes the event_intraday table once all the data has been loaded. The deletion of this table therefore guarantees that all the data is in place, which is why we base the trigger on it. Choose google.cloud.bigquery.v2.TableService.DeleteTable as the event type.
  • Then, for the resource part, there are two possible options:
    • Either select “Any resource” to see all table deletions and find the right deletion in the trigger history, then select the “Path Pattern” option and choose the right pattern.
    • Or, select “Path Pattern” directly and find the right event in the logs (see step 4 (optional) to find the logs).
      In our case, the Path Pattern will look like this: projects/NAME_OF_PROJECT/datasets/analytics_XXXXXXXXX/tables/*
  • Finally, select the region corresponding to the project region.
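The Path Pattern behaves like a glob over the table’s full resource name. As a rough local sketch (the project and dataset IDs below are made up, and Python’s fnmatch is slightly more permissive than Eventarc’s matcher, since its `*` also crosses `/`), you can sanity-check which resource names a pattern would cover:

```python
from fnmatch import fnmatchcase

# Hypothetical project and dataset IDs -- replace with your own.
PATH_PATTERN = "projects/my-project/datasets/analytics_123456789/tables/*"

def pattern_matches(resource_name: str, pattern: str = PATH_PATTERN) -> bool:
    """Glob-match a BigQuery table resource name against the trigger's path pattern.

    Note: fnmatch's '*' also crosses '/', which Eventarc's does not, so this
    is only a rough sanity check, not Eventarc's exact matching rules.
    """
    return fnmatchcase(resource_name, pattern)

# A deletion of the intraday table inside the dataset matches:
print(pattern_matches("projects/my-project/datasets/analytics_123456789/tables/events_intraday_20240101"))  # True
# A table in a different dataset does not:
print(pattern_matches("projects/my-project/datasets/other_dataset/tables/events_20240101"))  # False
```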

If you don’t see the webhook triggered in the history even though tables have been deleted in the dataset, first select the “Any resource” option (option 1 above) to verify that your path pattern is correct. If the pattern is right, the problem is most likely a region mismatch.

Step 4 (optional): Retrieve logs

To access the logs, you’ll need to open a second Google Cloud Console tab.

  • In the navigation menu (top left), go to Logging > Logs Explorer.
  • In Logs Explorer, set a filter to target BigQuery activity.
  • You can use the following advanced query to filter logs of changes to a specific table, for example:
    • resource.type="bigquery_resource" (limits results to BigQuery resources)
    • resource.labels.dataset_id="DATASET_NAME" (filters on the dataset containing the table)
    • resource.labels.table_id="TABLE_NAME" (filters by table name)
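Put together in the Logs Explorer query box, those filters form a single query; you can optionally add a methodName filter to keep only table deletions (the dataset and table names are placeholders):

```
resource.type="bigquery_resource"
resource.labels.dataset_id="DATASET_NAME"
resource.labels.table_id="TABLE_NAME"
protoPayload.methodName="google.cloud.bigquery.v2.TableService.DeleteTable"
```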

Step 5: Define the webhook to call

  • Once the trigger has been defined, move on to the next step, defining the workflow source. If you’re not familiar with code, don’t worry: here are the few lines you need to add:
main:
  params: [event]
  steps:
    - callDatama:
        call: http.get
        args:
          url: 'https://api.prep.datama.io/v1.0/webhook/XXXXXXXXXXXXXXXXXXXXXXX'
        result: response
    - returnCallOutput:
        return: '${response}'
  • The only field you’ll need to modify is “url”: in Datama’s export block, click on the “webhook url” field and the URL will be copied to your clipboard, ready to paste here.
  • Press Deploy and your webhook is ready for testing.
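If you also want each run’s history to record which event fired the call, a slightly extended variant of the same workflow might look like the sketch below. The sys.log step is an optional addition of ours (not part of Datama’s instructions), and the webhook URL placeholder is unchanged:

```yaml
main:
  params: [event]
  steps:
    # Optional: write the triggering event into Cloud Logging for traceability
    - logEvent:
        call: sys.log
        args:
          text: '${"Triggered by event " + json.encode_to_string(event)}'
          severity: INFO
    - callDatama:
        call: http.get
        args:
          url: 'https://api.prep.datama.io/v1.0/webhook/XXXXXXXXXXXXXXXXXXXXXXX'
        result: response
    - returnCallOutput:
        return: '${response}'
```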

Step 6: Finalize workflow

  • Finalization consists in making sure that the webhook works properly, which involves two tests.
  • First, check that the script itself works: open your workflow and select “Execute” from the top menu to run it manually. If the run succeeds and you receive your Datama export, the script is functional.
  • Then, check the trigger: go to the dataset defined in your path pattern, create an empty table and delete it. The deletion should make a call to the webhook appear on the workflow details page, and you should receive an export.
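The trigger test (creating and deleting an empty table) can be simulated with two statements in the BigQuery SQL editor; the dataset name below is hypothetical, so use the one from your path pattern:

```sql
-- Create a throwaway table in the dataset covered by the path pattern...
CREATE TABLE `analytics_123456789.webhook_smoke_test` (dummy INT64);
-- ...then delete it: the DeleteTable event should fire the workflow.
DROP TABLE `analytics_123456789.webhook_smoke_test`;
```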

If these two tests are successful, your webhook is functional.

C. (optional) Update Datama workbook

Please note that the webhook URL contains information specific to the export you’ve defined in Datama Prep. This means that every time you change even a single parameter in Datama, the webhook address changes.

It is therefore necessary to copy the new link from Datama Prep and replace the one in the code we defined in step 5, in the following location: url: ‘new webhook URL to be pasted’.

Conclusion

By integrating a webhook into your BigQuery workflow, you optimize the sending of Datama exports by triggering them precisely at the right time, without depending on fixed schedules. This approach ensures that your analyses are based on fresh, complete data, especially for GA4. Thanks to meticulous configuration, from trigger definition to workflow validation, you secure an automated, reliable and responsive process.

In short, this method enables you to make more efficient and relevant use of your data.
