Paradime Help Docs
Packages

dbt packages allow you to import pre-built models, macros, and tests into your project, helping you solve common data modeling challenges without reinventing the wheel. This guide explains how to install, use, configure, and maintain dbt packages.

What Are dbt Packages?

dbt packages are essentially standalone dbt projects that can be imported into your project. They contain reusable models, macros, tests, and other resources that extend dbt's functionality and help solve common data modeling challenges.

Packages enable you to:

  • Leverage community-contributed solutions

  • Standardize transformations across projects

  • Import specialized functionality for specific data sources

  • Apply consistent testing patterns

  • Avoid reinventing solutions for common problems


Adding Packages to Your Project

Using packages in your dbt project is a simple three-step process:

  1. Create a packages.yml file in your project root (next to your dbt_project.yml)

  2. Define the packages you want to use

  3. Run dbt deps to install the packages

Basic Package Configuration

```yaml
# packages.yml
packages:
  - package: dbt-labs/dbt_utils
    version: 1.1.1

  - package: calogica/dbt_expectations
    version: 0.8.5
```

When you run dbt deps, dbt will install these packages into a dbt_packages/ directory in your project. By default, this directory is ignored by git to avoid duplicating code.
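If your repository doesn't already ignore this directory, a one-line `.gitignore` entry (included in dbt's starter project by default) keeps installed package code out of version control:

```gitignore
# .gitignore
dbt_packages/
```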


Package Installation Methods

dbt supports several methods for specifying package sources, depending on where your package is stored.

Hub Packages (Recommended)

The simplest way to install packages is from the dbt Hub:

```yaml
packages:
  - package: dbt-labs/snowplow
    version: 0.7.3
```

You can also specify version ranges using semantic versioning:

```yaml
packages:
  - package: dbt-labs/snowplow
    version: [">=0.7.0", "<0.8.0"]
```

This approach is recommended because the Hub can handle duplicate dependencies automatically.

Git Packages

For packages stored in Git repositories:

```yaml
packages:
  - git: "https://github.com/dbt-labs/dbt-utils.git"
    revision: 0.9.2
```

The revision parameter can be:

  • A branch name

  • A tag name

  • A specific commit (40-character hash)

Local Packages

For packages on your local filesystem:

```yaml
packages:
  - local: relative/path/to/package
```

This is useful for testing package changes or working with monorepos.

Package Versioning Best Practices

  • Always pin package versions in production projects

  • Use semantic versioning ranges for minor updates

  • Test package updates thoroughly before deploying to production

  • Beginning with dbt v1.7, running dbt deps automatically pins packages by creating a package-lock.yml file


Using Package Functionality

Once installed, you can use the resources from packages in your project.

Using Package Macros

Call macros from the package in your models:

```sql
-- Using dbt_utils.generate_surrogate_key
SELECT
    {{ dbt_utils.generate_surrogate_key(['customer_id', 'order_date']) }} as order_sk,
    customer_id,
    order_date,
    amount
FROM {{ ref('stg_orders') }}
```
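Under the hood, `generate_surrogate_key` hashes the concatenated, null-safe column values. On most adapters the Jinja above compiles to SQL close to the following sketch (the exact casts and null placeholder vary by dbt_utils version and warehouse):

```sql
-- Approximate compiled output of generate_surrogate_key(['customer_id', 'order_date'])
md5(cast(
    coalesce(cast(customer_id as varchar), '_dbt_utils_surrogate_key_null_')
    || '-' ||
    coalesce(cast(order_date as varchar), '_dbt_utils_surrogate_key_null_')
as varchar))
```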

Using Package Tests

Apply tests provided by packages in your schema files:

```yaml
models:
  - name: customers
    columns:
      - name: email
        tests:
          - dbt_expectations.expect_column_values_to_match_regex:
              regex: '^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
```

Referencing Package Models

Reference models from packages using the standard ref function:

```sql
-- Reference a model from a package
SELECT * FROM {{ ref('snowplow', 'snowplow_page_views') }}
```

When referencing models from packages, you can include the package name as the first argument to ref.


Configuring Packages

Many packages allow you to configure their behavior using variables in your dbt_project.yml file:

```yaml
# dbt_project.yml

vars:
  # Configure the snowplow package
  snowplow:
    'snowplow:timezone': 'America/New_York'
    'snowplow:page_ping_frequency': 10
    'snowplow:events': "{{ ref('sp_base_events') }}"

# Override package configurations
models:
  snowplow:
    +schema: snowplow_models
```

You can also override materializations, schemas, or other configurations defined in the package.


Popular dbt Packages

Here are some widely-used packages that can enhance your dbt projects:

| Package | Purpose | Key Features |
| --- | --- | --- |
| dbt-utils | General utilities | Cross-database macros, SQL helpers, schema tests |
| dbt-expectations | Data quality testing | Advanced testing functions inspired by Great Expectations |
| dbt-date | Date/time functionality | Date spine generation, fiscal periods, holiday calendars |
| dbt-audit-helper | Auditing and comparison | Model comparison, reconciliation helpers |
| codegen | Code generation | Auto-generate source definitions and base models |
| dbt-meta-testing | Documentation and test coverage | Enforce documentation and test coverage requirements |
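To pull several of these into one project, list them together in `packages.yml`. The version ranges below are illustrative only; check hub.getdbt.com for current releases:

```yaml
# packages.yml — example versions; verify the latest on hub.getdbt.com
packages:
  - package: dbt-labs/dbt_utils
    version: [">=1.1.0", "<1.2.0"]
  - package: calogica/dbt_expectations
    version: [">=0.10.0", "<0.11.0"]
  - package: dbt-labs/codegen
    version: [">=0.12.0", "<0.13.0"]
```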


Working with Private Packages

For organizations with internal packages, dbt supports several methods for authentication.

Private Hub Packages

You can use private packages with the proper authentication:

```yaml
packages:
  - private: dbt-labs/internal-package
    provider: "github"  # Specify if you have multiple git providers configured
```

Git Token Method

For HTTPS authentication with a token:

```yaml
packages:
  - git: "https://{{env_var('GIT_CREDENTIAL')}}@github.com/dbt-labs/internal-package.git"
```

Environment Variables

When using environment variables with dbt, ensure they're available in your execution environment. You can set these as environment variables in your operating system or in your CI/CD pipeline.
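For example, assuming the `GIT_CREDENTIAL` variable name used above, you could set the token in your shell (or your CI/CD secret store) before installing packages; the token value here is a placeholder:

```shell
# Make the token visible to dbt's env_var() lookup before running `dbt deps`.
# Replace the placeholder with a real personal access token.
export GIT_CREDENTIAL="ghp_xxxxxxxxxxxxxxxx"
```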

SSH Key Method (Command Line)

For command-line users with SSH authentication:

```yaml
packages:
  - git: "git@github.com:dbt-labs/internal-package.git"
```

Package Maintenance

Updating Packages

To update packages:

  1. Change the version/revision in packages.yml

  2. Run dbt deps to install the updated packages

  3. Test the changes thoroughly before deploying

Uninstalling Packages

To remove a package:

  1. Delete it from your packages.yml file

  2. Run dbt clean to remove the installed package files

  3. Run dbt deps to reinstall remaining packages
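Note that `dbt clean` only deletes the directories listed under `clean-targets` in `dbt_project.yml`, so make sure `dbt_packages` is included there (it is in dbt's starter project):

```yaml
# dbt_project.yml
clean-targets:
  - target
  - dbt_packages
```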


Advanced Package Techniques

Handling Package Conflicts

When using multiple packages, you might encounter naming conflicts. You can resolve these by:

  1. Using fully-qualified references:

```jinja
{{ dbt_utils.generate_surrogate_key(['id']) }}
```

  2. Overriding package macros in your project:

```jinja
{% macro generate_surrogate_key(field_list) %}
    {# Your custom implementation #}
{% endmacro %}
```
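For an override to take effect inside a package's own dispatched macro calls, dbt's `dispatch` config must search your project first. A minimal sketch, assuming your project is named `my_project`:

```yaml
# dbt_project.yml — resolve dbt_utils macros from your project before the package
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['my_project', 'dbt_utils']
```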

Subdirectory Configuration

For packages nested in subdirectories (e.g., in monorepos):

```yaml
packages:
  - git: "https://github.com/dbt-labs/dbt-labs-experimental-features"
    subdirectory: "materialized-views"
```
