Run and Test all your dbt™ Models
This template creates a schedule that executes `dbt run` and `dbt test` across your entire dbt™ project, ensuring comprehensive model execution and validation. It serves as a foundational production schedule, automatically updating and validating all your models on a regular basis to maintain data freshness and quality.
This template assumes your Scheduler Environment is connected to your data warehouse provider.
| Setting | Value | Explanation |
|---|---|---|
| Schedule Type | | Ensures consistent execution for production workloads in a single environment. Best for regular data pipeline runs |
| Schedule Name | | Descriptive name that indicates purpose |
| Git Branch | | Uses your default production branch to ensure you're always running the latest approved code |
The template uses two sequential commands that work together to build and validate your data pipeline:

- `dbt run`: Executes SQL transformations to build or update all models in your dbt™ project, following your defined model dependencies
- `dbt test`: After models are built, runs your configured data tests to ensure data quality, including schema tests (unique, not null) and custom data quality tests

This sequence ensures that all your models are not only built but also validated before being used by downstream consumers.
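Conceptually, the schedule behaves like the following shell sequence (a sketch; it assumes the dbt CLI is installed and a connection profile is configured):

```shell
# Build all models, then validate them. The `&&` mirrors the
# scheduler's sequential behavior: if `dbt run` fails, the job
# stops before `dbt test` executes.
dbt run && dbt test
```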
For custom command configurations, see Command Settings documentation.
- Type: Scheduled Run (Cron)
- Cron Schedule: `0 */2 * * *` (every 2 hours, starting at minute 0, to balance frequent data updates and reasonable resource usage)
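For reference, the fields of the cron expression read as follows:

```shell
# 0 */2 * * *
# │  │  │ │ └─ day of week: * (any)
# │  │  │ └─── month: * (any)
# │  │  └───── day of month: * (any)
# │  └──────── hour: */2 (every 2nd hour: 00, 02, ..., 22)
# └─────────── minute: 0 (at the top of the hour)
```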
For custom Trigger configurations, see Trigger Types documentation.
Email Alerts:

- Success: Confirms all models were built and tested successfully, letting you know your data pipeline is healthy
- Failure: Immediately alerts you when models fail to build or tests fail, allowing quick response to issues
- SLA Breach: Alerts when runs take longer than the configured duration (default: 2 hours), helping identify performance degradation
For custom notification configurations, see Notification Settings documentation.
Regular Production Updates: Keep production data fresh by regularly rebuilding models based on upstream changes. Essential for business reporting and dashboards that need current data.
Continuous Data Validation: Catch data quality issues early by running tests after every model build. Prevents bad data from flowing to downstream consumers.
Initial Project Setup: Get started quickly with a proven production schedule configuration that follows dbt™ best practices.
Tailor this template to your specific needs:
Adjust the trigger schedule based on data freshness requirements:

- Hourly updates for critical models (`0 * * * *`)
- Daily updates for standard reporting (`0 0 * * *`)
- Weekly updates for historical analysis (`0 0 * * 0`)
Modify command settings to control what gets built and tested:

- Build specific models:
  - `dbt run --select finance.*+` (finance models and their downstream dependents)
  - `dbt run --select state:modified+` (modified models and their downstream dependents)
- Test specific models:
  - `dbt test --select tag:critical` (tests on models tagged `critical`)
  - `dbt test --select config.severity:error` (run only error-severity tests)
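For example, a narrower schedule for one business area might pair the same selector across both commands (illustrative; assumes your project has a `finance` directory):

```shell
# Build the finance models and everything downstream of them,
# then run tests against that same selection.
dbt run --select finance.*+
dbt test --select finance.*+
```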
Add notification destinations (Slack, MS Teams) for team collaboration