Paradime Help Docs
Trigger a Slack notification when a Bolt run is overrunning

Last updated 1 year ago

Introduction

This guide will show you how to set up an integration between Paradime and Slack using Paradime Webhooks and Zapier.

At the end of this tutorial, you will be able to trigger a custom Slack notification when a Bolt schedule is overrunning, including details of the Bolt schedule.

Prerequisites

For this integration, make sure you have the following:

  • A Paradime Bolt schedule configured

  • Paradime Webhooks enabled

  • A Slack account

  • A Zapier account

Create a new Zap in Zapier

To trigger an action each time a webhook event is delivered in Zapier, create a new Zap with Webhooks by Zapier as the Trigger and Catch Hook as the Event.

Press Continue, and copy the webhook URL. You will need this to connect Zapier to Paradime.

Configure a new Webhook in Paradime

In Paradime navigate to Account Settings and select Webhooks from the left Panel. Now you can click on + Add Endpoint to configure your webhook.

  • Enter the webhook URL generated by Zapier in the previous step

  • Add a description for your webhook integration (optional)

  • Filter the events to include only the bolt.run.started event

When done, click on Create to set up the webhook.
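For orientation, the fields this guide later maps from the Catch Raw Hook step (ID, Api Url, Start Dttm) and references in the Slack template arrive inside the webhook event. The sketch below shows a hypothetical payload shape for illustration only; the field names mirror the inputs used later in this guide, but the values and exact schema are assumptions, not the authoritative Paradime webhook specification.

```python
# Hypothetical bolt.run.started payload for illustration only; values and the
# exact schema are assumptions, not the authoritative Paradime webhook spec.
event = {
    "id": "123456",                                 # Bolt run ID -> API_QUERY
    "api_url": "https://api.example.com/graphql",   # placeholder -> API_ENDPOINT
    "start_dttm": "2024-01-01T10:00:00+00:00",      # run start time -> START_DTTM
    "schedule": {"name": "daily_run", "owner_email": "owner@example.com"},
}

print(event["id"], event["start_dttm"])
```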

Configure a Zapier delay

Now let's add a time delay to pause the workflow for a defined amount of time. This sets the threshold after which we will check whether the Bolt schedule is still in progress.

Add a Delay by Zapier step in Zapier, choose Delay For as the event type, and configure your time delay. In the example below, we check the status of a Bolt schedule run after 120 minutes. Click Continue.

Store secrets

To check the status of the Bolt schedule in a later step, you will need to generate the API credentials for your Paradime workspace.

Zapier allows you to store secrets. This prevents your keys from being displayed as plaintext in the Zap code. You can access them with the StoreClient utility.

Create a Storage by Zapier connection

Go to https://zapier.com/app/connections/storage, create a new connection, and save the UUID secret you generate for later.

Add a temporary code step

Add a Code by Zapier step and choose Run Python as the Event. Run the following code once to store your credentials; you can remove this step afterwards:

store = StoreClient('abc123') # replace with your UUID secret
store.set('API_KEY', 'abc123') # replace with the Paradime API KEY
store.set('API_SECRET', 'abc123') # replace with the Paradime API SECRET

Add a Zapier code action

Now let's add a code action that will allow us to call the Paradime API and check the status of the Bolt run. Choose Code by Zapier as the application and select Run Python as the event.

In the Set up action section, you will need to add three Input Data fields that will be used in the Python code.

  • API_QUERY: add the Paradime API query to check the status of the Bolt run, mapping the ID field from the previous Catch Raw Hook step into it, as shown below:

query BoltRunStatus {
  boltRunStatus(runId: <ID>) {
    ok
    state
  }
}
  • API_ENDPOINT: map this input to the Api Url field from the previous Catch Raw Hook step, as shown below.

  • START_DTTM: map this input to the Start Dttm field from the previous Catch Raw Hook step, as shown below.

In the Code field, add the following Python code, replacing YOUR_SECRET_HERE with the secret you created when setting up the Storage by Zapier integration.

This code action calls the Paradime API to check the status of the Bolt run and computes the elapsed time of the run.

import json
import requests
from datetime import datetime

# Access secret credentials
secret_store = StoreClient('YOUR_SECRET_HERE')
api_key = secret_store.get('API_KEY')
api_secret = secret_store.get('API_SECRET')


# Fetch zapier input data
api_endpoint = input_data['API_ENDPOINT']
api_query = input_data['API_QUERY']
start_time_str = input_data['START_DTTM']

headers = {
    'X-API-KEY': api_key, 
    'X-API-SECRET': api_secret, 
    'Content-Type': 'application/json'
}

# Convert start_time_str to a datetime object
start_time = datetime.fromisoformat(start_time_str)
# Calculate elapsed time (handle both naive and timezone-aware timestamps)
current_time = datetime.now(start_time.tzinfo) if start_time.tzinfo else datetime.utcnow()
elapsed_time = int(round((current_time - start_time).total_seconds() / 60))  # rounds to the nearest minute

# Call the Paradime API endpoint to check the status of the Bolt Run
response = requests.post(api_endpoint, headers=headers, data=json.dumps({'query': api_query}))
data = response.json()
status = data.get('data', {}).get('boltRunStatus', {}).get('state', 'UNKNOWN')

output = {
    'status': status,
    'elapsed_time': elapsed_time
}
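As a standalone check of the elapsed-time arithmetic above, the sketch below computes the whole minutes between an ISO-8601 start timestamp and a fixed reference time. The timestamps are hypothetical example values, not output from a real Bolt run.

```python
from datetime import datetime

def elapsed_minutes(start_time_str: str, now: datetime) -> int:
    """Whole minutes between an ISO-8601 start timestamp and `now`."""
    start_time = datetime.fromisoformat(start_time_str)
    return int(round((now - start_time).total_seconds() / 60))

# A run that started at 10:00 UTC, checked at 12:00 UTC, has been running 120 minutes.
reference = datetime.fromisoformat("2024-01-01T12:00:00+00:00")
print(elapsed_minutes("2024-01-01T10:00:00+00:00", reference))  # 120
```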

Configure a Zapier filter

Now let's add a Zapier filter so the workflow continues only if the Bolt run status == RUNNING. This ensures that the next step, which posts a Slack message, is not triggered if the Bolt run has already completed.

Add an Only continue if... step in Zapier, choose the Status field from the webhook response, and set the condition to continue only when Status exactly matches RUNNING.

Click Continue. If the example Bolt run for your Zap setup matches the above state, a confirmation message will display, indicating that the Zap will proceed.
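The filter's exact-match semantics can be sketched as a simple predicate (a hypothetical helper for illustration, not part of Zapier):

```python
def should_notify(status: str) -> bool:
    # The Zap continues only when the Bolt run state exactly matches RUNNING
    return status == "RUNNING"

print(should_notify("RUNNING"))    # True
print(should_notify("COMPLETED"))  # False
```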

Configure a Slack action

Finally, let's set up the Slack integration and configure our Slack message. Select Slack as the App and Send Channel Message as the Action.

In the Action section, choose which Channel to post to.

You can now configure the Message Text field and create your own Slack message template, or use the one in the example below. You can use all the available input data from the Catch Raw Hook step and the previous Zapier code action.

Slack message template
--- Replace the variables below with the Input Data in Zapier

⚠️ Bolt Schedule <{{run_url}} | *{{schedule__name}}*> is still running after {{elapsed_time}} minutes
>{{commands_command}}

🏢 _Workspace name:_ {{workspace__name}}

⏲️ _Started at:_ {{start_dttm}}

👤 _Owner:_ {{schedule__owner_email}}

🕹️ _Trigger:_ {{environment__actor}}

🪪 _RunID:_ {{id}}

ℹ️ <{{run_url}} | View run logs in Paradime>
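If you prefer to assemble the message text in a code step rather than through Zapier's field mapping, a minimal sketch could look like this. The field values are hypothetical stand-ins; real values come from the Catch Raw Hook step and the code action above.

```python
# Hypothetical values standing in for the Zapier input data; real values come
# from the Catch Raw Hook step and the previous code action.
fields = {
    "run_url": "https://app.example.com/bolt/run/123456",  # placeholder URL
    "schedule__name": "daily_run",
    "elapsed_time": 120,
    "workspace__name": "analytics",
}

message = (
    "⚠️ Bolt Schedule <{run_url} | *{schedule__name}*> is still running "
    "after {elapsed_time} minutes\n"
    "🏢 _Workspace name:_ {workspace__name}"
).format(**fields)

print(message)
```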

Configure the other options with:

  • Send as a bot: Yes

  • Bot Name: Paradime

  • Bot Icon, using this URL: https://20146536.fs1.hubspotusercontent-na1.net/hubfs/20146536/logo-no-text-light-transparent-3x.png

Test and deploy your Zap

When done, you can click on Continue and Test your Zap, and finally click on the Publish button to go live. With this Zap, if a Bolt run is still running after 120 minutes, a Slack notification will be posted to the selected Slack channel.
