Alerts Configuration and Customization


Last updated 7 months ago


You can enrich your alerts by adding properties to tests, models and sources in your .yml files. The supported attributes are: owner, subscribers, description, tags.

You can also configure and customize how alerts are distributed by configuring:

  • Suppression interval

  • Alert fields (for test alerts only)

  • Alert grouping

  • Alert filters

Alert properties in .yml files

Elementary prioritizes configuration in the following order:

For models / sources:

  1. Model config block.

  2. Model properties.

  3. Model path configuration under models key in dbt_project.yml.

For tests:

  1. Test properties.

  2. Tests path configuration under tests key in dbt_project.yml.

  3. Parent model configuration.

dbt_project.yml
meta:
  owner: "@jessica.jones"
  subscribers: ["@jessica.jones", "@joe.joseph"]
  description: "This is the test description"
  tags: ["#marketing", "#data_ops"]
  channel: data_ops
  alert_suppression_interval: 24
  slack_group_alerts_by: table
  alert_fields: ["description", "owners", "tags", "subscribers", ...]
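The precedence rules above amount to a "first match wins" lookup across configuration layers. The sketch below is illustrative only, not Elementary's implementation; the function and layer names are invented for the example:

```python
# Illustrative sketch (not Elementary's code): resolve a model attribute
# such as "owner" using the documented priority order.

def resolve_model_attribute(attribute, config_block, model_properties, project_yml_path_config):
    """Return the first value found, walking layers in priority order."""
    for layer in (config_block, model_properties, project_yml_path_config):
        if attribute in layer:
            return layer[attribute]
    return None

# The model config block wins over the properties file and dbt_project.yml:
value = resolve_model_attribute(
    "owner",
    config_block={"owner": "@jessica.jones"},         # 1. model config block
    model_properties={"owner": "@joe.joseph"},        # 2. model properties
    project_yml_path_config={"owner": "@data.team"},  # 3. dbt_project.yml path config
)
assert value == "@jessica.jones"
```

For tests, the same lookup applies with the documented order: test properties, then the tests path configuration in dbt_project.yml, then the parent model's configuration.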

Alert content

Owner

Elementary enriches alerts with the owners of models and tests. If you want the owner to be tagged on Slack, use '@' and the email prefix of the Slack user (@jessica.jones to tag jessica.jones@marvel.com).

  • You can configure a single owner or a list of owners (["@jessica.jones", "@joe.joseph"]).

models:
  - name: my_model_name
    meta:
      owner: "@jessica.jones"
tests:
  - not_null:
      meta:
        owner: ["@jessica.jones", "@joe.joseph"]
{{ config(
    tags=["Tag1","Tag2"],
    meta={
        "description": "This is a description",
        "owner": "@jessica.jones"
    }
) }}
models:
  path:
    subfolder:
      +meta:
        owner: "@jessica.jones"

tests:
  path:
    subfolder:
      +meta:
        owner: "@jessica.jones"

Subscribers

If you want additional users besides the owner to be tagged on an alert, add them as subscribers. If you want a subscriber to be tagged on Slack, use '@' and the email prefix of the Slack user (@jessica.jones to tag jessica.jones@marvel.com).

  • You can configure a single subscriber or a list (["@jessica.jones", "@joe.joseph"]).

models:
  - name: my_model_name
    meta:
      subscribers: "@jessica.jones"
tests:
  - not_null:
      meta:
        subscribers: ["@jessica.jones", "@joe.joseph"]
{{ config(
    meta={
        "subscribers": "@jessica.jones"
    }
) }}
models:
  path:
    subfolder:
      +meta:
        subscribers: "@jessica.jones"

tests:
  path:
    subfolder:
      +meta:
        subscribers: "@jessica.jones"

Test description

Elementary supports configuring a description for tests, which is included in alerts. It's recommended to add an explanation of what it means when the test fails, so the alert includes this context.

tests:
  - not_null:
      meta:
        description: "This is the test description"
{{ config(
    tags=["Tag1","Tag2"],
    meta={
        "description": "This is the test description"
    }
) }}
tests:
  path:
    subfolder:
      +meta:
        description: "This is the test description"

Tags

You can use tags to provide context to your alerts.

  • You can tag a group or a channel in a Slack alert by adding #channel_name as a tag.

  • Tags are aggregated, so a test alert will include both the test and the parent model tags.

models:
  - name: my_model_name
    tags: ["#marketing", "#data_ops"]
tests:
  - not_null:
      tags: ["#marketing", "#data_ops"]
{{ config(
    tags=["#marketing", "#data_ops"]
) }}
models:
  path:
    subfolder:
      +tags: ["#marketing", "#data_ops"]

tests:
  path:
    subfolder:
      +tags: ["#marketing", "#data_ops"]

Alerts distribution

Elementary allows you to customize alerts to distribute the right information to the right people. This way you can ensure your alerts are valuable and avoid alert fatigue.

Suppression interval

Don't want to get multiple alerts if the same test keeps failing? You can configure an alert_suppression_interval, a "snooze" period for alerts on the same issue.

The accepted value is in hours, so a 1-day snooze is alert_suppression_interval: 24. Elementary won't send new alerts on the same issue that are generated within the suppression interval.

Note: if you configure a suppression interval using this method, it overrides the value in the global configuration.

models:
  - name: my_model_name
    meta:
      alert_suppression_interval: 24
tests:
  - not_null:
      meta:
        alert_suppression_interval: 12
{{ config(
    meta={
        "alert_suppression_interval": 24
    }
) }}
models:
  path:
    subfolder:
      +meta:
        alert_suppression_interval: 24

tests:
  path:
    subfolder:
      +meta:
        alert_suppression_interval: 48
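The suppression behavior can be illustrated with a small sketch. This is not Elementary's code, and the function name is hypothetical; it only demonstrates the documented semantics that a new alert for the same issue is suppressed until the configured number of hours has elapsed:

```python
from datetime import datetime, timedelta

# Illustrative sketch (not Elementary's code): with
# alert_suppression_interval: 24, an alert for the same issue is only
# sent again once 24 hours have passed since the previous alert.

def should_send_alert(last_sent, now, suppression_interval_hours):
    if last_sent is None:
        return True  # first occurrence always alerts
    return now - last_sent >= timedelta(hours=suppression_interval_hours)

first_alert = datetime(2024, 1, 1, 9, 0)
# The same test fails again six hours later: suppressed.
assert not should_send_alert(first_alert, datetime(2024, 1, 1, 15, 0), 24)
# It fails again two days later: a new alert is sent.
assert should_send_alert(first_alert, datetime(2024, 1, 3, 9, 0), 24)
```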

Group alerts by table

By default, Elementary sends a single alert to notify on each failure with extensive information for fast triage.

Elementary also supports grouping alerts by table. In this case, a single Slack notification will be generated containing all issues associated with this table. The created notification will contain a union of the relevant owners, tags and subscribers.

Due to their nature, grouped alerts will contain less information on each issue.

models:
  - name: my_model_name
    meta:
      slack_group_alerts_by: table
tests:
  - not_null:
      meta:
        slack_group_alerts_by: table
{{ config(
    meta={
        "slack_group_alerts_by": "table"
    }
) }}
models:
  path:
    subfolder:
      +meta:
        slack_group_alerts_by: table

tests:
  path:
    subfolder:
      +meta:
        slack_group_alerts_by: table
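The union behavior of grouped alerts can be sketched as follows. This is an illustration only, with hypothetical names, not Elementary's implementation:

```python
# Illustrative sketch (not Elementary's code): grouping by table merges
# all issues for a table into one notification whose owners, tags and
# subscribers are the union of the individual issues' values.

def group_issues_by_table(issues):
    grouped = {}
    for issue in issues:
        g = grouped.setdefault(
            issue["table"],
            {"issues": [], "owners": set(), "tags": set(), "subscribers": set()},
        )
        g["issues"].append(issue["test"])
        g["owners"].update(issue.get("owners", []))
        g["tags"].update(issue.get("tags", []))
        g["subscribers"].update(issue.get("subscribers", []))
    return grouped

grouped = group_issues_by_table([
    {"table": "customers", "test": "not_null", "owners": ["@jessica.jones"], "tags": ["#marketing"]},
    {"table": "customers", "test": "unique", "owners": ["@joe.joseph"], "tags": ["#data_ops"]},
])
# One notification for the table, with the union of owners and tags:
assert grouped["customers"]["owners"] == {"@jessica.jones", "@joe.joseph"}
assert grouped["customers"]["tags"] == {"#marketing", "#data_ops"}
```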

Alert fields

Currently, this feature is supported only for test alerts.

You can decide which fields to include in the alert, creating an alert format that fits your use case and recipients. By default, all fields are included.

Supported alert fields:

  • table: Displays the table name of the test

  • column: Displays the column name of the test

  • description: Displays the description of the test

  • owners: Displays the owners of the model on which the test is running

  • tags: Displays the dbt tags of the test/model

  • subscribers: Displays the subscribers of the test/model

  • result_message: Displays the returned message from the test result

  • test_parameters: Displays the parameters that were provided to the test

  • test_query: Displays the query of the test

  • test_results_sample: Displays a sample of the test results

models:
  - name: my_model_name
    meta:
      alert_fields: ["description", "owners", "tags", "subscribers"]
tests:
  - not_null:
      meta:
        alert_fields: ["description", "owners", "tags", "subscribers"]
{{ config(
    meta={
        "alert_fields": ["description", "owners", "tags", "subscribers"]
    }
) }}
models:
  path:
    subfolder:
      +meta:
        alert_fields: ["description", "owners", "tags", "subscribers"]

tests:
  path:
    subfolder:
      +meta:
        alert_fields: ["description", "owners", "tags", "subscribers"]
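The effect of alert_fields can be sketched as a simple filter over the alert payload. This is illustrative only, with hypothetical names, not Elementary's implementation:

```python
# Illustrative sketch (not Elementary's code): an alert_fields list keeps
# only the chosen fields in the rendered test alert; when no list is
# configured, every supported field is included.

ALL_FIELDS = ["table", "column", "description", "owners", "tags", "subscribers",
              "result_message", "test_parameters", "test_query", "test_results_sample"]

def render_alert(alert_data, alert_fields=None):
    fields = alert_fields if alert_fields is not None else ALL_FIELDS
    return {k: v for k, v in alert_data.items() if k in fields}

alert = {
    "table": "customers",
    "column": "id",
    "description": "id must not be null",
    "owners": ["@jessica.jones"],
}
rendered = render_alert(alert, alert_fields=["description", "owners"])
assert rendered == {"description": "id must not be null", "owners": ["@jessica.jones"]}
```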

Alerts CLI flags

Filter alerts

Elementary supports filtering alerts by tag, owner, model, status, or resource type.

Using filters, you can send alerts to the relevant people and teams by running edr multiple times with different filters on each run.

Alerts on skipped tests and models are filtered out by default. If you want to receive these alerts, apply the statuses filter and include them explicitly.

edr monitor --filters tags:critical
edr monitor --filters tags:finance,marketing
edr monitor --filters owners:@jeff
edr monitor --filters owners:@jessy,@jeff
edr monitor --filters models:customers
edr monitor --filters models:orders,customers,signups
edr monitor --filters statuses:warn,fail
edr monitor --filters statuses:error
edr monitor --filters statuses:skipped
edr monitor --filters resource_types:model
edr monitor --filters resource_types:test
edr monitor --filters resource_types:test,source_freshness

The --filters flag can be used multiple times to apply multiple filters. The filters are combined using the logical AND operator. The comma , is used to separate multiple values for the same filter, creating a logical OR.

edr monitor --filters resource_types:model --filters tags:finance,marketing
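The AND/OR semantics of combined filters can be sketched as follows. This is an illustration of the documented behavior, with hypothetical names, not Elementary's implementation:

```python
# Illustrative sketch (not Elementary's code): repeated --filters flags
# are combined with logical AND, while comma-separated values within a
# single flag are combined with logical OR.

def matches(alert, filters):
    """filters: list of (key, values) pairs, one pair per --filters flag."""
    for key, values in filters:
        # At least one value per flag must match (OR within a flag) ...
        if not any(v in alert.get(key, []) for v in values):
            return False  # ... and every flag must match (AND across flags).
    return True

# edr monitor --filters resource_types:model --filters tags:finance,marketing
filters = [("resource_types", ["model"]), ("tags", ["finance", "marketing"])]

assert matches({"resource_types": ["model"], "tags": ["finance"]}, filters)
# A model with neither tag is filtered out:
assert not matches({"resource_types": ["model"], "tags": ["sales"]}, filters)
```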

Group alerts by table

Grouping alerts by table, as described above, can also be enabled from the CLI. A single Slack notification is generated containing all issues associated with a table, with a union of the relevant owners, tags and subscribers. Grouped alerts contain less information on each individual issue.

edr monitor --group-by table

Suppression interval flag

The alert suppression interval described above can also be set from the CLI. The flag value is the suppression interval duration in hours; the default is 0 hours (no alert suppression). If a suppression interval is configured in the dbt project config block or meta, the CLI value is ignored (unless --override-dbt-project-config is used).

edr monitor --suppression-interval 24
