Bolt API


Overview

  • This feature is available with the Paradime Bolt plan.

  • Your API keys must have either Bolt Schedules Admin or Bolt Schedules Metadata Viewer capabilities.

The Bolt API allows you to easily manage and control Bolt schedules and runs within your workspace.
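
Every example below sends the same kind of authenticated POST request. A small helper can cut the repetition and surface GraphQL-level errors early. This is a sketch, not part of the API: the function names and the 30-second timeout are our own choices.

```python
import requests

def build_headers(api_key: str, api_secret: str) -> dict:
    # The three headers used by every example on this page.
    return {
        "Content-Type": "application/json",
        "X-API-KEY": api_key,
        "X-API-SECRET": api_secret,
    }

def run_graphql(endpoint, api_key, api_secret, query, variables=None):
    """POST a GraphQL document and return the `data` payload.

    Raises on HTTP failures and on GraphQL-level errors, which GraphQL
    servers report in an `errors` key alongside (or instead of) `data`.
    """
    response = requests.post(
        endpoint,
        json={"query": query, "variables": variables or {}},
        headers=build_headers(api_key, api_secret),
        timeout=30,  # assumption: a sensible client-side timeout
    )
    response.raise_for_status()
    payload = response.json()
    if payload.get("errors"):
        raise RuntimeError(f"GraphQL errors: {payload['errors']}")
    return payload["data"]
```

Each endpoint below then reduces to one call, e.g. `run_graphql(endpoint, key, secret, query, {"offset": 0, "limit": 10, "showInactive": False})`.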

List Bolt schedules

This endpoint will return the active Bolt schedules in your workspace.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
query ListBoltSchedules {
    listBoltSchedules(offset: 0, limit: 10, showInactive: false) {
        schedules {
            name
            schedule
            owner
            lastRunAt
            lastRunState
            nextRunAt
            id
            uuid
            source
            turboCi {
                enabled
                deferredScheduleName
                successfulRunOnly
            }
            deferredSchedule {
                deferredScheduleName
                enabled
                successfulRunOnly
            }
            commands
            gitBranch
            slackOn
            slackNotify
            emailOn
            emailNotify
        }
        totalCount
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "query ListBoltSchedules($offset: Int!, $limit: Int!, $showInactive: Boolean!) { listBoltSchedules(offset: $offset, limit: $limit, showInactive: $showInactive) { schedules { name schedule owner lastRunAt lastRunState nextRunAt id uuid source turboCi { enabled deferredScheduleName successfulRunOnly } deferredSchedule { deferredScheduleName enabled successfulRunOnly } commands gitBranch slackOn slackNotify emailOn emailNotify } totalCount } }",
       "variables": {
         "offset": 0,
         "limit": 10,
         "showInactive": false
       }
     }'
Example response
{
  "data": {
    "listBoltSchedules": {
      "schedules": [
        {
          "name": "fabio-chld-yaml",
          "schedule": "OFF",
          "owner": "fabio@paradime.io",
          "lastRunAt": null,
          "lastRunState": null,
          "nextRunAt": null,
          "id": 973767,
          "uuid": "f5f98a76-c978-31c5-9296-aae9903653f4",
          "source": "yaml",
          "turboCi": null,
          "deferredSchedule": null,
          "commands": ["dbt run"],
          "gitBranch": "main",
          "slackOn": [""],
          "slackNotify": [""],
          "emailOn": [""],
          "emailNotify": [""]
        }
      ],
      "totalCount": 26
    }
  }
}
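
The response is paginated via offset and limit, with totalCount reporting the overall number of schedules. A small loop can walk every page; this is a sketch in which fetch_page is a placeholder for whatever client code you use to issue the listBoltSchedules query.

```python
def list_all_schedules(fetch_page, page_size=10):
    """Collect every schedule by walking offset/limit pages.

    fetch_page(offset, limit) should return the listBoltSchedules
    payload, i.e. a dict with "schedules" and "totalCount" keys.
    """
    schedules, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        schedules.extend(page["schedules"])
        offset += page_size
        if offset >= page["totalCount"]:
            return schedules
```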

Get Bolt schedule details

This endpoint will enable you to check the status of a schedule by passing a Bolt scheduleName.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
query BoltScheduleName {
    boltScheduleName(scheduleName: "daily run") {
        ok
        latestRunId
        commands
        owner
        schedule
        uuid
        source
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
  
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "query BoltScheduleName($scheduleName: String!) { boltScheduleName(scheduleName: $scheduleName) { ok latestRunId commands owner schedule uuid source } }",
       "variables": {
         "scheduleName": "daily run"
       }
     }'
Example response
{
  "data": {
    "boltScheduleName": {
      "ok": true,
      "latestRunId": 15475,
      "commands": ["dbt test"],
      "owner": "john@acme.io",
      "schedule": "OFF",
      "uuid": "a3d6ceea-abe3-333e-ac8b-c0b48cce5678",
      "source": "ui"
    }
  }
}

Trigger a Bolt run

This endpoint will enable you to trigger a Bolt schedule run by passing a schedule name.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
mutation TriggerBoltRun {
    triggerBoltRun(scheduleName: "daily run") {
        runId
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "mutation TriggerBoltRun($scheduleName: String!) { triggerBoltRun(scheduleName: $scheduleName) { runId } }",
       "variables": {
         "scheduleName": "daily run"
       }
     }'
Example response
{
  "data": {
    "triggerBoltRun": {
      "runId": 15477
    }
  }
}

Trigger a Bolt run with custom commands

This endpoint will enable you to trigger a Bolt schedule with custom commands, overriding the commands defined in the schedule for that particular run.

This only overrides the commands at runtime for the triggered run; the commands configured in the schedule remain unchanged.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
mutation TriggerBoltRun {
    triggerBoltRun(scheduleName: "daily run", commands: ["dbt compile", "dbt test"]) {
        runId
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
  
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "mutation TriggerBoltRun($scheduleName: String!, $commands: [String!]) { triggerBoltRun(scheduleName: $scheduleName, commands: $commands) { runId } }",
       "variables": {
         "scheduleName": "daily run",
         "commands": ["dbt compile", "dbt test"]
       }
     }'
Example response
{
  "data": {
    "triggerBoltRun": {
      "runId": 15483
    }
  }
}

Trigger a Bolt run with a custom git branch

This endpoint will enable you to trigger a Bolt schedule with a custom git branch, overriding the branch defined in the schedule configuration.

This only overrides the branch at runtime for the triggered run; the branch configured in the schedule remains unchanged.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
mutation TriggerBoltRun {
    triggerBoltRun(scheduleName: "daily run", branch: "feature-branch-123") {
        runId
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "mutation TriggerBoltRun($scheduleName: String!, $branch: String!) { triggerBoltRun(scheduleName: $scheduleName, branch: $branch) { runId } }",
       "variables": {
         "scheduleName": "daily run",
         "branch": "feature-branch-123"
       }
     }'
Example response
{
  "data": {
    "triggerBoltRun": {
      "runId": 15483
    }
  }
}

Cancel a Bolt run

This endpoint will enable you to cancel a Bolt run by passing the runId of a Bolt schedule run.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
mutation CancelBoltRun {
    cancelBoltRun(scheduleRunId: 15507) {
        ok
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
  
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "mutation CancelBoltRun($scheduleRunId: Int!) { cancelBoltRun(scheduleRunId: $scheduleRunId) { ok } }",
       "variables": {
         "scheduleRunId": 15507
       }
     }'
Example response
{
  "data": {
    "cancelBoltRun": {
      "ok": true
    }
  }
}

Get Bolt run status

This endpoint will enable you to check the status of a Bolt run by passing the runId.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
query BoltRunStatus {
    boltRunStatus(runId: 15509) {
        ok
        state
        commands {
            id
            command
            startDttm
            endDttm
            stdout
            stderr
            returnCode
        }
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "query BoltRunStatus($runId: Int!) { boltRunStatus(runId: $runId) { ok state commands { id command startDttm endDttm stdout stderr returnCode } } }",
       "variables": {
         "runId": 15509
       }
     }'
Example response
{
  "data": {
    "boltRunStatus": {
      "ok": true,
      "state": "ERROR",
      "commands": [
        {
          "id": 59241,
          "command": "dbt test",
          "startDttm": "2024-08-27 22:59:01.476387",
          "endDttm": "2024-08-27 22:59:11.323692",
          "stdout": "the command logs",
          "stderr": "the command logs",
          "returnCode": 1
        },
        {
          "id": 59239,
          "command": "git clone ...",
          "startDttm": "2024-08-27 22:58:57.738046",
          "endDttm": "2024-08-27 22:58:59.135456",
          "stdout": "",
          "stderr": "the command logs",
          "returnCode": -10
        },
        {
          "id": 59240,
          "command": "dbt deps",
          "startDttm": "2024-08-27 22:58:59.234308",
          "endDttm": "2024-08-27 22:59:01.468001",
          "stdout": "the command logs",
          "stderr": "the command logs",
          "returnCode": -10
        }
      ]
    }
  }
}
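
Because triggerBoltRun returns a runId immediately, a common pattern is to poll boltRunStatus until the run reaches a terminal state. The sketch below takes the status fetcher as a callable; note that only the ERROR state appears in the example above, so the other terminal state names here are assumptions.

```python
import time

# Assumption: terminal state names. "ERROR" appears in the example
# response above; the success/cancel names are illustrative guesses.
TERMINAL_STATES = {"COMPLETED", "ERROR", "CANCELED"}

def wait_for_run(fetch_state, poll_seconds=30, timeout_seconds=3600):
    """Poll fetch_state() until it returns a terminal state.

    fetch_state is any zero-argument callable returning the current
    `state` string, e.g. a closure around the boltRunStatus query.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        state = fetch_state()
        if state in TERMINAL_STATES:
            return state
        time.sleep(poll_seconds)
    raise TimeoutError(f"run still not finished after {timeout_seconds}s")
```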

Get Bolt command details

This endpoint will enable you to extract all the related details for a given command, including raw error logs, by passing a commandId. This is normally used in conjunction with Paradime Webhooks.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
query BoltCommand {
    boltCommand(commandId: 59241) {
        command
        startDttm
        endDttm
        stdout
        stderr
        returnCode
        scheduleRunId
        resources {
            id
            path
        }
        ok
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
  
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "query BoltCommand($commandId: Int!) { boltCommand(commandId: $commandId) { command startDttm endDttm stdout stderr returnCode scheduleRunId resources { id path } ok } }",
       "variables": {
         "commandId": 59241
       }
     }'
Example response
{
  "data": {
    "boltCommand": {
      "command": "dbt test",
      "startDttm": "2024-08-27 22:59:01.476387",
      "endDttm": "2024-08-27 22:59:11.323692",
      "stdout": "your command log",
      "stderr": "your command log",
      "returnCode": 1,
      "scheduleRunId": 15509,
      "resources": [
        {"id": 306227, "path": "target/partial_parse.msgpack"},
        {"id": 306228, "path": "target/manifest.json"},
        {"id": 306229, "path": "target/semantic_manifest.json"},
        {"id": 306230, "path": "target/graph_summary.json"},
        {"id": 306231, "path": "target/graph.gpickle"},
        {"id": 306232, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_customer_conversions_revenue.sql"},
        {"id": 306233, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_customer_conversions_converted_at.sql"},
        {"id": 306234, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_customer_conversions_customer_id.sql"},
        {"id": 306235, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_sessions_customer_id.sql"},
        {"id": 306236, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_sessions_ended_at.sql"},
        {"id": 306237, "path": "target/compiled/demo_sales_project/data/schema.yml/not_null_sessions_started_at.sql"},
        {"id": 306238, "path": "target/compiled/demo_sales_project/data/schema.yml/unique_customer_conversions_customer_id.sql"},
        {"id": 306239, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/accepted_values_stg_payments_c7909fb19b1f0177c2bf99c7912f06ef.sql"},
        {"id": 306240, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/accepted_values_stg_orders_status__placed.sql"},
        {"id": 306241, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/not_null_stg_payments_payment_id.sql"},
        {"id": 306242, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/not_null_stg_customers_customer_id.sql"},
        {"id": 306243, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/unique_stg_customers_customer_id.sql"},
        {"id": 306244, "path": "target/compiled/demo_sales_project/models/staging/schema.yml/unique_stg_payments_payment_id.sql"},
        {"id": 306245, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/accepted_values_order_items_7244335ae3a9655e09872a7a5b8cb110.sql"},
        {"id": 306246, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_amount.sql"},
        {"id": 306247, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_coupon_amount.sql"},
        {"id": 306248, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_bank_transfer_amount.sql"},
        {"id": 306249, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_credit_card_amount.sql"},
        {"id": 306250, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_customer_id.sql"},
        {"id": 306251, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_order_id.sql"},
        {"id": 306252, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_gift_card_amount.sql"},
        {"id": 306253, "path": "target/compiled/demo_sales_project/models/marts/core/schema.yml/unique_order_items_order_id.sql"},
        {"id": 306254, "path": "target/compiled/demo_sales_project/models/marts/core/intermediate/schema.yml/accepted_values_order_payments_b6878290bdd2ef4d6ef0513a1d8fdbbc.sql"},
        {"id": 306255, "path": "target/compiled/demo_sales_project/models/marts/core/intermediate/schema.yml/not_null_order_payments_payment_id.sql"},
        {"id": 306256, "path": "target/compiled/demo_sales_project/models/marts/core/intermediate/schema.yml/relationships_order_payments_1c8c3f46d5739a85e060a829410bf06d.sql"},
        {"id": 306257, "path": "target/compiled/demo_sales_project/models/marts/core/intermediate/schema.yml/unique_order_payments_payment_id.sql"},
        {"id": 306258, "path": "target/compiled/demo_sales_project/tests/assert_total_payment_amount_is_positive.sql"},
        {"id": 306259, "path": "target/run/demo_sales_project/data/schema.yml/not_null_customer_conversions_converted_at.sql"},
        {"id": 306260, "path": "target/run/demo_sales_project/data/schema.yml/not_null_customer_conversions_revenue.sql"},
        {"id": 306261, "path": "target/run/demo_sales_project/data/schema.yml/not_null_customer_conversions_customer_id.sql"},
        {"id": 306262, "path": "target/run/demo_sales_project/data/schema.yml/not_null_sessions_customer_id.sql"},
        {"id": 306263, "path": "target/run/demo_sales_project/data/schema.yml/not_null_sessions_ended_at.sql"},
        {"id": 306264, "path": "target/run/demo_sales_project/data/schema.yml/not_null_sessions_started_at.sql"},
        {"id": 306265, "path": "target/run/demo_sales_project/data/schema.yml/unique_customer_conversions_customer_id.sql"},
        {"id": 306266, "path": "target/run/demo_sales_project/tests/assert_total_payment_amount_is_positive.sql"},
        {"id": 306267, "path": "target/run/demo_sales_project/models/staging/schema.yml/accepted_values_stg_payments_c7909fb19b1f0177c2bf99c7912f06ef.sql"},
        {"id": 306268, "path": "target/run/demo_sales_project/models/staging/schema.yml/accepted_values_stg_orders_status__placed.sql"},
        {"id": 306269, "path": "target/run/demo_sales_project/models/staging/schema.yml/not_null_stg_payments_payment_id.sql"},
        {"id": 306270, "path": "target/run/demo_sales_project/models/staging/schema.yml/not_null_stg_customers_customer_id.sql"},
        {"id": 306271, "path": "target/run/demo_sales_project/models/staging/schema.yml/unique_stg_customers_customer_id.sql"},
        {"id": 306272, "path": "target/run/demo_sales_project/models/staging/schema.yml/unique_stg_payments_payment_id.sql"},
        {"id": 306273, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/accepted_values_order_items_7244335ae3a9655e09872a7a5b8cb110.sql"},
        {"id": 306274, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_amount.sql"},
        {"id": 306275, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_coupon_amount.sql"},
        {"id": 306276, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_bank_transfer_amount.sql"},
        {"id": 306277, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_credit_card_amount.sql"},
        {"id": 306278, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_customer_id.sql"},
        {"id": 306279, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_gift_card_amount.sql"},
        {"id": 306280, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/not_null_order_items_order_id.sql"},
        {"id": 306281, "path": "target/run/demo_sales_project/models/marts/core/schema.yml/unique_order_items_order_id.sql"},
        {"id": 306282, "path": "target/run/demo_sales_project/models/marts/core/intermediate/schema.yml/accepted_values_order_payments_b6878290bdd2ef4d6ef0513a1d8fdbbc.sql"},
        {"id": 306283, "path": "target/run/demo_sales_project/models/marts/core/intermediate/schema.yml/not_null_order_payments_payment_id.sql"},
        {"id": 306284, "path": "target/run/demo_sales_project/models/marts/core/intermediate/schema.yml/relationships_order_payments_1c8c3f46d5739a85e060a829410bf06d.sql"},
        {"id": 306285, "path": "target/run/demo_sales_project/models/marts/core/intermediate/schema.yml/unique_order_payments_payment_id.sql"},
        {"id": 306286, "path": "target/run_results.json"},
        {"id": 306287, "path": "logs/dbt.log"}
      ],
      "ok": true
    }
  }
}

Get Bolt command resource URL

This endpoint will enable you to extract, for a given command, the resource generated by the execution of the command (for example, run_results.json or manifest.json) by passing a resourceId. This is normally used in conjunction with Paradime Webhooks.

Example Request

import requests

# API credentials
api_endpoint = "<YOUR_API_ENDPOINT>"
api_key = "<YOUR_API_KEY>"
api_secret = "<YOUR_API_SECRET>"

graphql_query = """
query BoltResourceUrl {
    boltResourceUrl(resourceId: 306228) {
        ok
        url
    }
}
  """
  
response = requests.post(api_endpoint, json={"query": graphql_query}, headers={
      "Content-Type": "application/json",
      "X-API-KEY": api_key,
      "X-API-SECRET": api_secret,
  })

print(response.json())
curl -X POST "<YOUR_API_ENDPOINT>" \
     -H "Content-Type: application/json" \
     -H "X-API-KEY: <YOUR_API_KEY>" \
     -H "X-API-SECRET: <YOUR_API_SECRET>" \
     -d '{
       "query": "query BoltResourceUrl($resourceId: Int!) { boltResourceUrl(resourceId: $resourceId) { ok url } }",
       "variables": {
         "resourceId": 306228
       }
     }'
Example response
{
  "data": {
    "boltResourceUrl": {
      "ok": true,
      "url": "the url to the resource e.g. manifest.json"
    }
  }
}
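
The returned url points at the raw artifact. Assuming the URL is directly fetchable without extra auth headers (an assumption worth verifying against your workspace), downloading it is a plain GET. This sketch streams the artifact to disk and derives a local file name from the resource path:

```python
import os
import requests

def resource_filename(resource_path: str) -> str:
    # Map a resource path like "target/manifest.json" to a local file name.
    return os.path.basename(resource_path)

def download_resource(url: str, dest_path: str) -> None:
    # Assumption: the URL from boltResourceUrl is directly fetchable
    # and requires no X-API-KEY / X-API-SECRET headers.
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(dest_path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=8192):
                fh.write(chunk)
```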
