Azure Data Factory (ADF)

You can use the Paradime APIs to integrate Bolt schedules executing dbt™️ into Azure Data Factory pipelines. Below is a guide on how to set up an ADF pipeline to trigger Bolt runs from ADF.

Requirements:

  1. Create a Bolt schedule in Paradime and set the schedule configuration to OFF.

  2. Generate API credentials for your Paradime Workspace.
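Under the hood, the ADF pipeline triggers the Bolt run through an authenticated HTTP request to the Paradime API. The Python sketch below only illustrates the general shape of such a request; the endpoint URL, header names, and request body shown here are placeholders, so use the exact values from your workspace's API credentials screen, the Paradime API documentation, and the Web activity in the provided ADF template.

```python
import requests

# Placeholders: substitute the values generated for your Paradime workspace.
API_URL = "<your Paradime API URL>"          # shown when generating the API credentials
API_KEY = "<paradimeApiKey>"
API_SECRET = "<paradimeApiSecret>"
SCHEDULE_NAME = "<your Bolt schedule name>"

# Hypothetical request shape: header and body field names may differ from
# what the Paradime API actually expects; check the Paradime API docs or the
# Web activity inside the provided ADF template for the exact format.
response = requests.post(
    API_URL,
    headers={
        "X-API-KEY": API_KEY,
        "X-API-SECRET": API_SECRET,
        "Content-Type": "application/json",
    },
    json={"schedule_name": SCHEDULE_NAME},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```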

1. Set up an Azure Key Vault to store API credentials

To get started, you will want to create an Azure Key Vault and store the Paradime API key and secret securely.

Select Secrets in the left panel and then click the Generate/Import option. Here you will be able to store the Paradime API credentials. Name the secrets as follows:

  • paradimeApiKey

  • paradimeApiSecret

and add the API credentials generated for your workspace.

In the next step, when loading the provided ADF template, we will need:

  • the name of your Azure Key Vault

  • the names of the secrets stored in the Key Vault
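If you prefer to manage the secrets programmatically rather than through the portal, a minimal sketch using the Azure SDK for Python (azure-identity and azure-keyvault-secrets) is shown below. The vault name and secret values are examples only; the secret names must match the ones the ADF template expects.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Example vault name: replace with the name of your own Key Vault.
VAULT_NAME = "my-paradime-vault"

# The signed-in identity needs Set/Get permissions on secrets in this vault.
client = SecretClient(
    vault_url=f"https://{VAULT_NAME}.vault.azure.net",
    credential=DefaultAzureCredential(),
)

# Store the Paradime API credentials under the names used by the ADF template.
client.set_secret("paradimeApiKey", "<your Paradime API key>")
client.set_secret("paradimeApiSecret", "<your Paradime API secret>")

# Sanity check: confirm the secrets can be read back.
print(client.get_secret("paradimeApiKey").name)
```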

2. Create your ADF pipeline

First, upload the template provided in this guide. This will create all the tasks, parameters, and variables needed to execute your ADF pipeline and trigger a Paradime Bolt run via the API.

Complete the configuration of your ADF pipeline by entering the required fields under Parameter Value.

Required fields:

  • apiURL: you can get this when generating the API credentials in Paradime.

  • scheduleName: the Bolt schedule name you want to trigger from the ADF pipeline.

  • keyVaultName: the name of the Azure Key Vault where your API secrets are stored.

When done, simply click the Publish button. You can now test your ADF pipeline triggering a Paradime Bolt schedule 🚀
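If you want to kick off a test run outside the ADF UI, one option is the azure-mgmt-datafactory SDK. The sketch below assumes hypothetical subscription, resource group, factory, and pipeline names, and passes the same three parameters configured above.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Example names: replace with your own Azure resources.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"
PIPELINE_NAME = "trigger-paradime-bolt"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Start a pipeline run, passing the parameters expected by the template.
run = adf.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={
        "apiURL": "<your Paradime API URL>",
        "scheduleName": "<your Bolt schedule name>",
        "keyVaultName": "<your Key Vault name>",
    },
)

# Poll the run until it reaches a terminal state.
while True:
    status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    print("Pipeline run status:", status)
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
```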
