Using Azure Pipelines

You can build your custom Continuous Deployment jobs using Azure Pipelines and Bolt APIs.

To use this feature, you must have a production environment configured in Paradime.

ℹ️ Check our setup guide for your data warehouse provider.

Create a CD Bolt schedule

To start, create a new Bolt schedule and make sure to add the deferred_schedule configuration. Depending on your intended behavior, you can defer to another production job or defer back to the last CD run.

The schedule set in deferred_schedule_name should have at least one successful run available, so that Paradime can pick up its manifest.json for state comparison.

Example schedule

  - name: continuous_deployment_run # the name of your CD job
    deferred_schedule:
      enabled: true # set to true to enable deferred runs for this CD job
      deferred_schedule_name: hourly_run # the Bolt schedule whose most recent successful run manifest.json is used for state comparison
    schedule: "OFF" # do not run on a cron schedule (to be used for PR only)
    environment: production # the environment used to run the schedule -> this is always production
    commands:
      - dbt run --select state:modified+ # the dbt™️ command you want to run after the pull request is merged
    owner_email: "" # the email of the CD job owner
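If you instead want to defer back to the last CD run (the second option mentioned above), you can point deferred_schedule_name at the CD schedule itself. A sketch, assuming the same schedule name as the example above:

```yaml
- name: continuous_deployment_run
  deferred_schedule:
    enabled: true
    # defer back to this schedule's own most recent successful run
    deferred_schedule_name: continuous_deployment_run
  schedule: "OFF"
  environment: production
  commands:
    - dbt run --select state:modified+
  owner_email: ""
```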

Generate API keys and find your workspace token

API keys are generated at the workspace level.

To trigger Bolt using the API, you first need to generate API keys for your workspace. Go to account settings, generate your API keys, and make sure to save the following in your password manager:

  • API key

  • API secret

  • API Endpoint

  • Workspace token

You will need these later when setting up the secrets in Azure Pipelines.


Create an Azure Pipeline

Now create a new azure-pipelines.yml file in your dbt™️ repository. Copy the code block below and fill in the required values.

Example Azure pipelines configuration file
  trigger:
    - main # Update this to your default branch name

  variables:
    - name: schedule_name
      value: <the schedule name set in Paradime> # example: continuous_deployment_run
    - name: api_endpoint
      value: <the API endpoint generated in the previous step>
    - name: workspace_token
      value: <the workspace token generated in the previous step> # example: 8p232d9mo4cvea9w
    - name: base_paradime_bolt_url
      value: <the Paradime URL of your instance, make sure to include /bolt/run_id/>
    - name: pythonVersion
      value: '3.8'

  pool:
    vmImage: ubuntu-latest

  steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: $(pythonVersion)
        addToPath: true
      displayName: 'Set Python Version'

    - script: |
        python -m pip install --upgrade pip
        pip install paradime-io
      displayName: 'Install dependencies'

    - script: |
        python -c "
        import time
        from paradime import Paradime

        # Create a Paradime client with your API credentials
        paradime = Paradime(api_endpoint='${{ variables.api_endpoint }}', api_key='$(API_KEY)', api_secret='$(API_SECRET)')

        # Trigger a run of the Bolt schedule and get the run ID
        run_id = paradime.bolt.trigger_run(schedule_name='${{ variables.schedule_name }}')
        print(f'Triggered Bolt run. View run details: ${{ variables.base_paradime_bolt_url }}{run_id}?workspaceToken=${{ variables.workspace_token }}')

        # Continuously check the run status
        while True:
            run_status = paradime.bolt.get_run_status(run_id)
            print(f'Run status: {run_status}')
            if run_status != 'RUNNING':
                break  # exit the loop once the run reaches a terminal state
            time.sleep(10)  # wait 10 seconds before checking again

        exit(0 if run_status == 'SUCCESS' else 1)
        "
      displayName: 'Trigger and Monitor Paradime Bolt Run'
      # Define a timeout for this job
      timeoutInMinutes: 60
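The polling loop in the script step above can be factored into a small helper. This is a minimal sketch, with the status lookup injected as a callable so it can be exercised without network access; `fetch_status` stands in for `paradime.bolt.get_run_status` and the status strings mirror those used in the pipeline script:

```python
import time
from typing import Callable

def wait_for_terminal_status(
    fetch_status: Callable[[], str],
    poll_seconds: float = 10.0,
    sleep: Callable[[float], None] = time.sleep,
) -> str:
    """Poll fetch_status until the run leaves RUNNING, then return the final status."""
    while True:
        status = fetch_status()
        print(f"Run status: {status}")
        if status != "RUNNING":
            return status  # terminal state, e.g. SUCCESS or FAILED
        sleep(poll_seconds)  # wait before checking again

# Example with a canned sequence of statuses instead of a live API call:
statuses = iter(["RUNNING", "RUNNING", "SUCCESS"])
final = wait_for_terminal_status(lambda: next(statuses), sleep=lambda _: None)
print(final)  # SUCCESS
```

In the pipeline you would pass `lambda: paradime.bolt.get_run_status(run_id)` as `fetch_status` and exit non-zero when the returned status is not SUCCESS.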

Add the API key and credentials in the Azure Pipeline variables

Finally, add the API key and secret generated in the previous step as secret variables in Azure Pipelines.

Set the corresponding values using your credentials for the variable names referenced in the pipeline script:

  • API_KEY

  • API_SECRET


