Orchestration

Paradime provides a handy API for including Bolt runs in your existing data pipelines, using orchestration tools such as Airflow, Prefect, Dagster, and Azure Data Factory (ADF).

Requirements

To trigger runs and check the status of running schedules using the Bolt API, you will need:

  • API Key

  • API Secret

  • API Endpoint

API Endpoints

To integrate Bolt schedules, Paradime provides two endpoints:

  • Trigger a schedule run

  • Check the status of a run
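As a rough sketch of how these two endpoints are typically called from an orchestrator, the snippet below builds authenticated HTTP requests with Python's standard library. The endpoint URL, request paths, payload fields, and header names here are illustrative assumptions, not the documented Bolt API contract; substitute the API key, secret, and endpoint from your Paradime workspace.

```python
"""Sketch: calling the Bolt API from an external orchestrator.

All URLs, paths, and header names below are assumptions for illustration --
replace them with the values provided in your Paradime workspace.
"""
import json
from urllib import request

API_ENDPOINT = "https://example.paradime.io/api"  # hypothetical endpoint
API_KEY = "your-api-key"                          # from your workspace
API_SECRET = "your-api-secret"                    # from your workspace


def build_trigger_request(schedule_name: str) -> request.Request:
    """Build (but do not send) the request that starts a schedule run."""
    payload = json.dumps({"schedule_name": schedule_name}).encode()
    return request.Request(
        url=f"{API_ENDPOINT}/schedules/trigger",  # assumed path
        data=payload,
        headers={
            "Content-Type": "application/json",
            "X-API-KEY": API_KEY,                 # assumed header names
            "X-API-SECRET": API_SECRET,
        },
        method="POST",
    )


def build_status_request(run_id: int) -> request.Request:
    """Build the request that checks the status of a run by its id."""
    return request.Request(
        url=f"{API_ENDPOINT}/runs/{run_id}/status",  # assumed path
        headers={"X-API-KEY": API_KEY, "X-API-SECRET": API_SECRET},
        method="GET",
    )


# Sending a request is then a one-liner, e.g.:
#   with request.urlopen(build_trigger_request("operations_run")) as resp:
#       body = json.load(resp)
```

An orchestrator task would typically call the trigger endpoint once, capture the run identifier from the response, and then poll the status endpoint until the run reaches a terminal state.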

Schedule configurations

You will need a schedule name defined in the paradime_schedules.yml file or in the UI.

For schedules you want to trigger via the Bolt API, make sure to set the schedule's cron configuration to "OFF" (see example below).

Creating Schedules
paradime_schedules.yml
schedules:
  - name: operations_run
    schedule: "OFF"
    environment: production
    commands:
      - dbt seed
      - dbt run -m +fact_orders
    owner_email: "fabio@paradime.io"
    slack_notify:
      - "#tmp-alert-testing"
    slack_on:
      - passed
      - failed
