Paradime Help Docs
Spectacles


Last updated 3 months ago


Overview

This documentation provides a detailed guide on integrating Spectacles.dev with Paradime to enhance continuous integration (CI) workflows for your dbt project hosted on GitHub.

This integration enables automated LookML testing and seamless deployment processes, ensuring high code quality and reliability. Spectacles runs a test suite, but instead of testing production data, it tests data in the schema that reflects the output of your dbt™️ changes.

Prerequisites

Before proceeding, ensure the following prerequisites are met:

  • An active Spectacles.dev account.

  • The Spectacles.dev GitHub app installed in your GitHub account with access to your dbt™ repository.

  • The Paradime GitHub app installed in your GitHub account with access to your dbt™ repository.

  • A Paradime Scheduler connection configured with target name: ci.

  • Paradime Bolt Turbo CI configured in your Paradime workspace.

Paradime Turbo CI configuration

The Turbo CI run must produce the complete output of your dbt™ project. If you are using modified state selection in your Turbo CI configuration, the initial step should clone your production schema into the target schema for each run. For example:

Turbo CI dbt™ commands:

dbt clone --target ci
dbt build --select state:modified+ --target ci

LookML configuration

You will need to configure schema references in your LookML using a user attribute, allowing Spectacles to modify the sql_table_name in your views. This adjustment directs Looker to query the Paradime Turbo CI-generated staging data instead of the production data.
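As a minimal sketch: Looker's Liquid variable _user_attributes is the documented way to reference a user attribute inside sql_table_name. The view name orders and the attribute name dbt_schema below are illustrative (dbt_schema matches the attribute name used later in this guide):

```lookml
view: orders {
  # Replace the hard-coded production schema with the dbt_schema user attribute.
  # Spectacles overrides this attribute during a CI run, so validation queries
  # hit the Turbo CI staging schema while normal users keep querying production.
  sql_table_name: {{ _user_attributes['dbt_schema'] }}.orders ;;
}
```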

Spectacles configuration

Set user attributes

For each user attribute you'd like to modify in Spectacles, you'll need to configure it with a default value on the Spectacles Settings page.

Here are the steps in Spectacles:

  1. In the top-right, click Settings ⚙️.

  2. Scroll down to the Looker Settings section.

  3. Under User attribute, input the name of the user attribute that controls the schema in your LookML.

  4. Under Default value, input the name of the schema you use for production. For non-CI runs, Spectacles will use this value.

  5. Click Update Settings.

Set up a new Suite

Follow these steps to create a new Suite that triggers SQL validation whenever a pull request is opened and Paradime Turbo CI completes successfully.

  1. Navigate to Suites

    • In the top menu, click Suites.

    • Click New Suite.

  2. Create a Simple Suite

    • Name your Suite: Paradime PRs.

  3. Configure Triggers

    • Leave all the options unchecked, as Paradime will trigger the Suite on completion of a Paradime Turbo CI run.

    • In the Schema User Attribute field, select the user attribute created in Looker.

  4. Configure the SQL Validator

    • Explores to query: Leave the default setting unless you need to specify models using the model selection syntax.

    • Explores to exclude: Leave the default setting.

    • Fail fast: Select Yes.

    • Query concurrency: Leave the default setting.

    • Uncheck Is Enabled for both the Content Validator and the Assert Validator.

  5. Finalize the Suite

    • Click Create Suite.

Your new Suite is now set up to trigger SQL validation whenever a pull request is opened and Paradime Turbo CI completes successfully.

Connect Spectacles.dev to your Paradime workspace

What you need to configure the Spectacles integration

  1. Get your Spectacles API key: You can generate or replace your API key in the Settings page for your Spectacles organisation.

Only one API key can be active at a time, so save your API key somewhere securely when generating it. If you lose it, you'll need to replace it with a new key.

  2. Retrieve your Spectacles IDs: Navigate to the Suite you created in the previous step (the one you want to trigger after a Paradime Turbo CI run) and copy the URL from your browser. It will look something like this:

https://app.spectacles.dev/org/pN9xfJfvcH6crAG0lyl4/proj/tYEXxrhFnkDJWv9gidAI/suites/ppRIWcoQLkcf6B5Ay78d/update

  • Get your Spectacles Org ID: This is the first ID, after org/, in your spectacles.dev URL. In the above example, the Org ID is pN9xfJfvcH6crAG0lyl4.

  • Get your Spectacles Project ID: This is the second ID, after proj/, in your spectacles.dev URL. In the above example, the Project ID is tYEXxrhFnkDJWv9gidAI.

  • Get your Spectacles Suite ID: This is the third ID, after suites/, in your spectacles.dev URL. In the above example, the Suite ID is ppRIWcoQLkcf6B5Ay78d.
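If you prefer, the three IDs can also be split out of the URL with a quick shell sketch (the URL below is the example from this guide; swap in your own):

```shell
# Extract the Org, Project, and Suite IDs from a Spectacles suite URL.
URL="https://app.spectacles.dev/org/pN9xfJfvcH6crAG0lyl4/proj/tYEXxrhFnkDJWv9gidAI/suites/ppRIWcoQLkcf6B5Ay78d/update"

# Each ID is the path segment immediately after org/, proj/, and suites/.
ORG_ID=$(echo "$URL" | sed -E 's#.*/org/([^/]+)/.*#\1#')
PROJECT_ID=$(echo "$URL" | sed -E 's#.*/proj/([^/]+)/.*#\1#')
SUITE_ID=$(echo "$URL" | sed -E 's#.*/suites/([^/]+)/.*#\1#')

echo "Org ID:     $ORG_ID"
echo "Project ID: $PROJECT_ID"
echo "Suite ID:   $SUITE_ID"
```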

  3. Get the Looker Schema User Attribute name: This is the user attribute previously set in Looker for the sql_table_name in your views. Following the Spectacles.dev guide, you might have called this dbt_schema.

  4. Get the Paradime scheduler connection schema value: This value corresponds to the schema name field set in the connection used to execute Paradime Turbo CI.

Set up the Spectacles integration in your Paradime workspace

To finalize the Spectacles integration in Paradime, you need to configure a set of Bolt Schedules Environment Variables within the Paradime workspace. Use the predefined environment variable names listed below:

Environment Variable Name                  Example Value
SPECTACLES_API_TOKEN                       eyJhbGci0Mwyjdb5srBdDsAVOKdN5E
SPECTACLES_ORG_ID                          pN9xfJfvcH6crAG0lyl4
SPECTACLES_PROJECT_ID                      tYEXxrhFnkDJWv9gidAI
SPECTACLES_SUITE_ID                        ppRIWcoQLkcf6B5Ay78d
SPECTACLES_CI_LOOKER_SCHEMA_NAME_KEY       dbt_schema
SPECTACLES_CI_LOOKER_SCHEMA_NAME_VALUE     ci

For additional Looker user attributes, add more variables as: SPECTACLES_LOOKER_USER_ATTRIBUTE_<user_attribute_name> = <value>
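For illustration only, here is the full variable set written as shell-style key=value pairs; the values are the examples from the table above, and dbt_database / analytics_ci is a hypothetical extra Looker user attribute mapped via the SPECTACLES_LOOKER_USER_ATTRIBUTE_ prefix. In Paradime, enter these in the Bolt Schedules Environment Variables settings rather than a shell:

```shell
# Illustrative example values; in Paradime these are entered in the
# workspace environment-variable settings, not exported in a shell.
export SPECTACLES_API_TOKEN="eyJhbGci0Mwyjdb5srBdDsAVOKdN5E"
export SPECTACLES_ORG_ID="pN9xfJfvcH6crAG0lyl4"
export SPECTACLES_PROJECT_ID="tYEXxrhFnkDJWv9gidAI"
export SPECTACLES_SUITE_ID="ppRIWcoQLkcf6B5Ay78d"
export SPECTACLES_CI_LOOKER_SCHEMA_NAME_KEY="dbt_schema"
export SPECTACLES_CI_LOOKER_SCHEMA_NAME_VALUE="ci"
# Hypothetical additional Looker user attribute, mapped via the prefix pattern:
export SPECTACLES_LOOKER_USER_ATTRIBUTE_dbt_database="analytics_ci"
```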