Schedules as Code


Last updated 6 months ago


What are Paradime YAML-Based Schedules?

Paradime YAML schedules are configuration-as-code definitions, allowing you to define, version, and manage your data pipeline schedules directly within your dbt project repository. These schedules are configured in a single file named paradime_schedules.yml located in the root directory of your dbt project (alongside dbt_project.yml).

Prerequisites

  • To run YAML-based schedules, first connect your data warehouse to the Scheduler Environment.

File Location and Structure

your-dbt-project/
├── dbt_project.yml
├── paradime_schedules.yml    # All schedules must be defined here
├── models/
├── tests/
└── ...

Why Use YAML-Based Schedules?

  1. Version Control

    • Schedule configurations are tracked alongside your dbt models

    • Review schedule modifications through Pull Requests

    • Enforce team review processes for schedule changes

  2. Infrastructure as Code

    • Schedules are treated as code, not just UI configurations

    • Easy replication across environments

    • Enables automated deployment pipelines

  3. Team Collaboration

    • Simplified schedule review process

    • Standard formatting and validation

    • Documentation lives with the code

How are YAML-based schedules deployed?

Schedules are always read from the paradime_schedules.yml file on your default branch (usually main or master).

  • Automatic Refresh: Paradime checks for changes every 10 minutes.

  • Manual Refresh: For immediate updates, navigate to the Bolt interface and click Parse Schedules.

💡 Note: To update your schedules, make sure to merge your changes to the default branch first.


Configuration Reference

This section describes the YAML configuration format for scheduling and managing automated tasks. The configuration supports several execution modes, including scheduled runs, trigger-based execution, deferred scheduling, CI/CD integration, and API-triggered runs.

Base Configuration

Every scheduler configuration must include these basic fields:

name: string              # Name of the schedule
description: string       # Description of the job
owner_email: string       # Email of the job owner
git_branch: string        # Target Git branch
environment: string       # Only "production" is supported
commands:                 # List of commands to execute
  - string
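As an illustrative sketch (not an official Paradime tool), the required-field rule above can be checked in a few lines of Python. `validate_schedules` and the example dict are hypothetical; the dict stands in for the result of parsing paradime_schedules.yml (e.g. with PyYAML's yaml.safe_load):

```python
# Hypothetical validator sketch: checks that every schedule entry defines
# the required base fields from the reference above.
REQUIRED_FIELDS = {"name", "description", "owner_email", "git_branch",
                   "environment", "commands"}

def validate_schedules(config: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    for i, sched in enumerate(config.get("schedules", [])):
        missing = REQUIRED_FIELDS - sched.keys()
        if missing:
            problems.append(f"schedule #{i}: missing {sorted(missing)}")
        if "environment" in sched and sched["environment"] != "production":
            problems.append(f"schedule #{i}: only 'production' is supported")
    return problems

# Stand-in for the parsed contents of paradime_schedules.yml.
example = {"schedules": [{"name": "daily run",
                          "description": "Daily build",
                          "owner_email": "john@acme.io",
                          "git_branch": "main",
                          "environment": "production",
                          "commands": ["dbt run"]}]}
print(validate_schedules(example))  # → []
```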

Execution Modes

1. Schedule-Triggered Execution

Basic scheduled execution using a cron expression:

schedule: '0 */2 * * *'     # Runs every 2 hours
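To make the cron semantics concrete, here is a small, hypothetical Python helper (it supports only the common `*`, `*/n`, `a-b`, and `a,b,c` forms, unlike a full cron parser) that expands one cron field into the values it matches:

```python
def expand_cron_field(field: str, lo: int, hi: int) -> list[int]:
    """Expand one cron field ('*', '*/n', 'a-b', 'a,b,c') into matching values.
    Illustrative sketch only; real cron parsers support more syntax."""
    values = set()
    for part in field.split(","):
        step = 1
        if "/" in part:                      # e.g. '*/2' -> every 2nd value
            part, step_str = part.split("/")
            step = int(step_str)
        if part == "*":
            start, end = lo, hi
        elif "-" in part:                    # e.g. '6-23' -> a range
            a, b = part.split("-")
            start, end = int(a), int(b)
        else:                                # a single value
            start = end = int(part)
        values.update(range(start, end + 1, step))
    return sorted(values)

# The hour field '*/2' from the schedule above matches the even hours:
print(expand_cron_field("*/2", 0, 23))
# → [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22]
```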

2. Run Completion Trigger

Triggers execution based on completion of another job:

schedule: 'OFF'
schedule_trigger:
  enabled: true
  schedule_name: string     # Name of the trigger schedule
  workspace_name: string    # Workspace containing the trigger schedule
  trigger_on:               # Events that trigger execution
    - passed
    - failed

3. Merge Trigger

Triggers execution on merge events:

schedule: 'OFF'
trigger_on_merge: true     # Enables merge-triggered execution

4. Deferred Scheduling

Allows a schedule to use dbt defer, comparing state against the artifacts of another schedule:

schedule: 'OFF'
deferred_schedule:
  enabled: true
  deferred_schedule_name: string  # Name of the deferred schedule
  successful_run_only: boolean    

5. Turbo CI Configuration

Configuration for CI pipelines:

schedule: 'OFF'
turbo_ci:
  enabled: true
  deferred_schedule_name: string  # Name of the schedule for CI
  successful_run_only: boolean    # Whether to run only after successful executions

6. API Configuration

Basic configuration when triggering Bolt via API:

schedule: 'OFF'

7. Suspended State

Configuration for suspended jobs:

suspended: true            # Indicates the job is suspended

Notifications Configuration

Notifications can be configured for various events through multiple channels:

notifications:
  emails:                  # Email notifications
    - address: string      # Recipient email address
      events:              # Events to notify this recipient about
        - passed           # Schedule completed successfully
        - failed           # Schedule completed with errors
        - sla              # SLA threshold exceeded

  microsoft_teams:         # Microsoft Teams notifications
    - channel: string      # Teams channel name
      events:              # Events to notify this channel about
        - passed
        - failed
        - sla

  slack_channels:          # Slack notifications
    - channel: string      # Slack channel name
      events:              # Events to notify this channel about
        - passed
        - failed
        - sla

sla_minutes: number        # SLA threshold in minutes
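As a sketch of how these per-recipient event lists route alerts, the following hypothetical Python function (`recipients_for` is not a Paradime API) collects the destinations subscribed to a given event from a parsed notifications block:

```python
def recipients_for(notifications: dict, event: str) -> list[str]:
    """List destinations subscribed to an event ('passed', 'failed', 'sla').
    Hypothetical routing sketch; input mirrors the YAML block above."""
    out = []
    for entry in notifications.get("emails", []):
        if event in entry.get("events", []):
            out.append(f"email:{entry['address']}")
    for key in ("microsoft_teams", "slack_channels"):
        for entry in notifications.get(key, []):
            if event in entry.get("events", []):
                out.append(f"{key}:{entry['channel']}")
    return out

config = {
    "emails": [{"address": "john@acme.io", "events": ["failed", "sla"]}],
    "slack_channels": [{"channel": "data-team", "events": ["passed", "failed"]}],
}
print(recipients_for(config, "failed"))
# → ['email:john@acme.io', 'slack_channels:data-team']
```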

For Slack and Microsoft Teams notifications, see our Slack and Microsoft Teams integration guides.

Example: Complete Configuration

The examples below are standalone snippets; in your actual paradime_schedules.yml, define every entry under a single top-level schedules: key.

schedules:
  - name: daily run
    description: "Daily build of all dbt models"
    owner_email: john@acme.io 
    environment: production
    git_branch: main  
    commands:
      - dbt run
      - dbt test
    schedule: '0 10 * * *'
    sla_minutes: 60
    notifications:
      emails:                 
        - address: john@acme.io 
          events:
            - failed
            - sla
      slack_channels:
        - channel: data-team    
          events:            
            - passed         
            - failed         
        - channel: pipeline-monitoring    
          events:                    
            - failed         
            - sla  
schedules:
  - name: "Finance reports update"
    description: "Update all finance models"
    owner_email: john@acme.io 
    environment: production
    git_branch: main  
    commands:
      - dbt run --select tag:finance
    schedule: 'OFF'
    schedule_trigger:
      enabled: true
      schedule_name: "Daily Run"
      workspace_name: data-platform
      trigger_on:
        - passed
        - failed  
    sla_minutes: 60
    notifications:
      emails:                 
        - address: john@acme.io 
          events:
            - failed
            - sla
      slack_channels:
        - channel: data-team    
          events:            
            - passed         
            - failed         
        - channel: pipeline-monitoring    
          events:                    
            - failed         
            - sla  
schedules:
  - name: "On Merge run CD"
    description: "Continuous deployment run to deploy changes as soon as they are merged to the main branch"
    owner_email: john@acme.io 
    environment: production
    git_branch: main  
    commands:
      - dbt run --select state:modified+
    schedule: 'OFF'
    trigger_on_merge: true 
    deferred_schedule:
      enabled: true
      deferred_schedule_name: "On Merge run CD"  
      successful_run_only: true
    sla_minutes: 60
    notifications:
      emails:                 
        - address: john@acme.io 
          events:
            - failed
      slack_channels:
        - channel: data-team    
          events:            
            - passed         
            - failed         
        - channel: pipeline-monitoring    
          events:                    
            - failed         

schedules:
  - name: "High Frequency run"
    description: "Trigger run and build models only when new data landed in sources tables"
    owner_email: john@acme.io 
    environment: production
    git_branch: main  
    commands:
      - dbt source freshness
      - dbt build --select source_status:fresher+ state:modified+ result:error+ result:fail+
    schedule: '0,30 6-23 * * *'
    deferred_schedule:
      enabled: true
      deferred_schedule_name: "High Frequency run"
      successful_run_only: false
    sla_minutes: 60
    notifications:
      emails:                 
        - address: john@acme.io 
          events:
            - failed
            - sla
      slack_channels:
        - channel: data-team    
          events:            
            - passed         
            - failed         
        - channel: pipeline-monitoring    
          events:                    
            - failed         
            - sla

schedules:
  - name: "Turbo CI run"
    description: "Trigger run and build models when a Pull Request is opened in a temporary schema"
    owner_email: john@acme.io 
    environment: production
    git_branch: main  
    commands:
      - dbt build --select state:modified+
    schedule: 'OFF'
    turbo_ci:
      enabled: true
      deferred_schedule_name: "High Frequency run"
      successful_run_only: false
    sla_minutes: 60
    notifications:
      emails:                 
        - address: john@acme.io 
          events:
            - failed
            - sla
      slack_channels:
        - channel: data-team    
          events:            
            - passed         
            - failed         
        - channel: pipeline-monitoring    
          events:                    
            - failed         
            - sla 

Best Practices

Schedule Format

  • Use standard cron expressions for scheduling

    • ✅ Standard cron to define days 0-6

      • 10 * * * 0-6 : At minute 10 on every day-of-week from Sunday through Saturday.

    • ❌ Non-standard cron to define days 1-7

      • 10 * * * 1-7 : At minute 10 on every day-of-week from Monday through Sunday

  • Use 'OFF' to disable scheduled execution
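The day-of-week rule can be sketched as a tiny, hypothetical normalizer (`normalize_dow` is illustrative, covering only the simple numeric cases above) that rewrites the non-standard 7 to the standard 0:

```python
def normalize_dow(field: str) -> str:
    """Rewrite a non-standard cron day-of-week field to standard 0-6 form.
    Illustrative sketch covering only simple numeric fields (no MON-SUN names)."""
    if field == "1-7":  # Monday through Sunday, i.e. every day
        return "0-6"
    # A bare 7 means Sunday in non-standard cron; standard cron uses 0.
    return ",".join("0" if tok == "7" else tok for tok in field.split(","))

print(normalize_dow("1-7"))  # → 0-6
print(normalize_dow("7"))    # → 0
```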

SLA Configuration

  • sla_minutes should be set based on job complexity

  • Consider upstream dependencies when setting the SLA

  • Recommended minimum: 30 minutes

Notification Configuration

  • Configure at least one notification channel

  • Include critical events (failed, SLA) in notifications

  • Use team channels for collaborative workflows


Paradime schedules terminal commands

Before running any of the following commands, navigate to your dbt™️ project directory, where paradime_schedules.yml is located.

paradime schedule verify
Validate file format: checks paradime_schedules.yml for formatting errors and outputs the result.

paradime schedule run
Run all schedules locally: runs all defined schedules based on your local context (i.e. your development environment and current branch).

paradime schedule run <schedule_name>
Run a selected schedule locally: runs the named schedule based on your local context.

paradime schedule run --dry-run
Dry run: simulates all schedule executions without running dbt™️ models.

paradime schedule run --dry-run <schedule_name>
Dry run (selected schedule): simulates the named schedule execution without running dbt™️ models.

💡 Use crontab.guru to validate your cron expressions.

💡 For more details on Paradime APIs, check our Developers guide.

💡 Make sure to set the Slack / MS Teams channel or email for system notifications. See our Notifications Settings guide.