Custom Tests

While dbt comes with built-in tests, custom tests allow you to implement specific data quality checks tailored to your business logic. This guide will help you understand how to create and use custom tests to ensure the quality and reliability of your data transformations.

Types of Custom Tests

dbt supports two main types of custom tests:

| Test Type | Description | Where Defined | Scope |
| --- | --- | --- | --- |
| Singular Tests | SQL files returning failing records | tests/ directory | Specific use case |
| Generic Tests | Reusable test definitions applicable to different models | macros/ directory | Reusable |


Singular Tests

Singular tests are SQL queries that should return zero rows when the test passes:

```sql
-- tests/assert_total_payment_amount_matches_order_amount.sql

SELECT
    order_id,
    order_amount,
    payment_amount,
    ABS(order_amount - payment_amount) AS amount_diff
FROM {{ ref('orders') }} o
LEFT JOIN {{ ref('payments') }} p USING (order_id)
WHERE ABS(order_amount - payment_amount) > 0.01
```

This test identifies orders where the payment amount doesn't match the order amount within a small tolerance.

To run singular tests:
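Singular tests run as part of the `dbt test` command; the `--select` flag narrows the run to a subset:

```shell
# Run every test in the project
dbt test

# Run only singular tests
dbt test --select test_type:singular

# Run one specific test by file name
dbt test --select assert_total_payment_amount_matches_order_amount
```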

Organizing Singular Tests

For larger projects, you might want to organize singular tests by domain or purpose:
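One possible layout, grouping tests into subdirectories by business domain (all file names except the first are illustrative):

```
tests/
├── finance/
│   ├── assert_total_payment_amount_matches_order_amount.sql
│   └── assert_refunds_not_exceed_payments.sql
└── marketing/
    └── assert_campaign_spend_within_budget.sql
```

dbt discovers tests recursively, so subdirectories require no extra configuration.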


Creating Generic Custom Tests

Generic tests are more powerful because they can be applied to any model or column throughout your project. They are defined as macros using dbt's `{% test %}` block syntax:

Example: Positive Values Test

Here's a simple custom test that checks if values in a column are positive:
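A minimal sketch of such a test, defined with the `{% test %}` block (the name positive_values is our choice; dbt passes `model` and `column_name` automatically when the test is applied to a column):

```sql
-- macros/test_positive_values.sql
{% test positive_values(model, column_name) %}

select {{ column_name }}
from {{ model }}
where {{ column_name }} <= 0

{% endtest %}
```

Note that NULL values pass this test, since `NULL <= 0` evaluates to NULL; pair it with a not_null test if NULLs should also fail.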

Using Custom Tests in YAML Files

Once defined, you can reference these custom tests in your schema YAML files just like built-in tests:
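For example, assuming a generic test named positive_values like the one described above:

```yaml
# models/schema.yml
version: 2

models:
  - name: orders
    columns:
      - name: order_amount
        tests:
          - not_null
          - positive_values
```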

Parameterizing Custom Tests

You can make your custom tests more flexible by adding parameters:
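A sketch of a parameterized variant (the name values_above_threshold and its default value are illustrative):

```sql
-- macros/test_values_above_threshold.sql
{% test values_above_threshold(model, column_name, threshold=0) %}

select {{ column_name }}
from {{ model }}
where {{ column_name }} <= {{ threshold }}

{% endtest %}
```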

In your YAML file:
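Assuming a hypothetical parameterized test named values_above_threshold with a threshold argument, arguments are passed as keys nested under the test name:

```yaml
models:
  - name: orders
    columns:
      - name: order_amount
        tests:
          - values_above_threshold:
              threshold: 10
```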


Combining Macros and Tests

You can use macros within your custom tests to create powerful, reusable testing frameworks:
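As a sketch, a macro that centralizes the list of valid statuses can be called from inside a generic test (both macro and test names are illustrative):

```sql
-- macros/valid_order_statuses.sql
{% macro valid_order_statuses() %}
    ('placed', 'shipped', 'completed', 'returned')
{% endmacro %}

-- macros/test_valid_status.sql
{% test valid_status(model, column_name) %}

select {{ column_name }}
from {{ model }}
where {{ column_name }} not in {{ valid_order_statuses() }}

{% endtest %}
```

If the set of valid statuses changes, only the macro needs updating; every test that calls it picks up the new values.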

This approach lets you centralize business rules (like valid status values) and reuse them across tests.


Advanced Test Configurations

You can configure how tests behave using additional properties:

| Configuration | Description | Example Use Case |
| --- | --- | --- |
| severity | Can be 'error' (default) or 'warn' | For tests that shouldn't block production runs |
| store_failures | When true, stores test failures in a table | For troubleshooting or monitoring over time |
| limit | Maximum number of failing records to return | For large tables where full results aren't needed |
| where | Apply additional filtering to the test query | For focusing tests on specific data subsets |
| enabled | Boolean that can conditionally disable the test | For environment-specific test configuration |
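These properties are set under a config key on the test; a sketch combining several of them (the model, column, and date filter are illustrative):

```yaml
models:
  - name: orders
    columns:
      - name: order_amount
        tests:
          - not_null:
              config:
                severity: warn
                store_failures: true
                limit: 100
                where: "order_date >= '2024-01-01'"
```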


Testing Data Quality with Packages

Popular packages like dbt-expectations extend dbt's testing capabilities with advanced data validation:
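Packages are installed by listing them in packages.yml and running `dbt deps`; the version range below is illustrative, so check the dbt package hub for current releases:

```yaml
# packages.yml
packages:
  - package: calogica/dbt-expectations
    version: [">=0.10.0", "<0.11.0"]
```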

Example: Advanced Data Validation

Using dbt-expectations to implement sophisticated data quality checks:
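For example, dbt-expectations provides range- and row-count-style checks that are applied in YAML like any other test (the bounds shown are illustrative):

```yaml
models:
  - name: orders
    tests:
      - dbt_expectations.expect_table_row_count_to_be_between:
          min_value: 1
    columns:
      - name: order_amount
        tests:
          - dbt_expectations.expect_column_values_to_be_between:
              min_value: 0
              max_value: 100000
```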

Popular Testing Packages

| Package | Purpose | Key Features |
| --- | --- | --- |
| dbt-expectations | Data quality tests inspired by Great Expectations | Advanced schema validation, statistical tests |
| dbt-audit-helper | Compare query results between models | Model comparison, reconciliation tests |
| dbt-utils | Common test utilities | Equal rowcounts, relationships, cardinality checks |
| elementary | Anomaly detection and data validation | Historical test comparisons, metrics monitoring |


Real-World Custom Test Examples

Date Range Validation
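A singular test that flags rows with dates outside a plausible window (the table, column, and bounds are illustrative):

```sql
-- tests/assert_order_dates_in_valid_range.sql
-- Fails on orders dated before the business existed or in the future
select
    order_id,
    order_date
from {{ ref('orders') }}
where order_date < '2015-01-01'
   or order_date > current_date
```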

Numeric Distribution Test
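A sketch of a distribution check: rather than testing individual rows, it asserts that an aggregate stays within an expected band (the bounds are illustrative and should come from historical data):

```sql
-- tests/assert_order_amount_distribution.sql
-- Fails if the average order amount drifts outside the expected range
select
    avg(order_amount) as avg_order_amount
from {{ ref('orders') }}
having avg(order_amount) not between 10 and 500
```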

Referential Integrity with Exceptions
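A sketch of a referential integrity check that tolerates a known exception (the guest_checkout carve-out is a hypothetical business rule):

```sql
-- tests/assert_orders_have_valid_customers.sql
-- Fails on orders whose customer_id has no match in customers,
-- except guest checkouts, which legitimately lack a customer record
select
    o.order_id,
    o.customer_id
from {{ ref('orders') }} o
left join {{ ref('customers') }} c using (customer_id)
where c.customer_id is null
  and o.customer_id is not null
  and o.order_type != 'guest_checkout'
```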


Best Practices for Custom Tests

| Best Practice | Description |
| --- | --- |
| Test Critical Data First | Focus on testing key business metrics and join keys. Identify high-risk areas for data quality issues. |
| Make Tests Descriptive | Name tests clearly to indicate what they verify. Add comments explaining the purpose and expectations. |
| Balance Coverage and Performance | Consider the runtime impact of extensive testing. Use selective testing for large tables. |
| Group Related Tests | Organize tests that verify related business rules. Use consistent naming conventions. |
| Handle Edge Cases | Test boundary conditions. Consider null handling and empty tables. |
| Monitor Test Results | Track test failures over time. Establish alerting for critical test failures. |

Pro Tip: Troubleshooting Failed Tests

When tests fail, use these strategies to diagnose issues:

  1. Check error messages in the dbt logs

  2. Examine a sample of failing records with store_failures: true

  3. Verify test logic by inspecting the compiled SQL in target/compiled/

  4. Use --vars to test with different parameters
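For example (the test name and variable below are illustrative):

```shell
# Persist failing rows to a table in the warehouse for inspection
dbt test --select assert_total_payment_amount_matches_order_amount --store-failures

# Re-run with an overridden variable to narrow down the cause
dbt test --vars '{"amount_tolerance": 0.05}'
```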

By implementing custom tests, you can ensure your transformations meet business requirements and maintain high data quality standards throughout your dbt project.
