BigQuery Multi-Project Service Account
Introduction
This guide walks through the process of configuring a Google Cloud service account to access multiple BigQuery projects. This setup is particularly useful for data engineering workflows that require querying or manipulating data across different projects within your organization's Google Cloud environment.
Prerequisites
Google Cloud account with administrative access
Multiple Google Cloud projects with BigQuery enabled
Step-by-Step Configuration
1. Create a Service Account
First, you'll need to create a service account in your primary Google Cloud project:
Navigate to the Google Cloud Console (https://console.cloud.google.com/)
Select your primary project from the project dropdown
Go to IAM & Admin > Service Accounts
Click Create Service Account
Enter the following details:
Service account name: bq-multi-project-sa (or your preferred name)
Service account ID: auto-generated from the name
Description: "Service account for accessing multiple BigQuery projects"
Click Create and Continue
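If you prefer the command line, the same service account can be created with the gcloud CLI. The project ID below is a placeholder — substitute your primary project's ID:

```shell
# Create the service account in your primary project
# (replace your-project-id with your primary project's ID)
gcloud iam service-accounts create bq-multi-project-sa \
  --project=your-project-id \
  --display-name="bq-multi-project-sa" \
  --description="Service account for accessing multiple BigQuery projects"
```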
2. Assign Roles in the Primary Project
Assign the necessary BigQuery roles to your service account in the primary project:
On the "Grant this service account access to project" screen:
Click Add Role and add the following roles:
BigQuery Data Editor
BigQuery Job User
Click Continue
Click Done to complete the service account creation
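The same role grants can be applied from the command line. The two roles map to roles/bigquery.dataEditor and roles/bigquery.jobUser; the project ID and service account email are placeholders:

```shell
# Grant the BigQuery roles in the primary project
# (replace your-project-id with your primary project's ID)
SA_EMAIL="bq-multi-project-sa@your-project-id.iam.gserviceaccount.com"

gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.dataEditor"

gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.jobUser"
```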
3. Create and Download the Service Account Key
Generate a key file for authentication:
From the Service Accounts list, click on your newly created service account
Navigate to the Keys tab
Click Add Key > Create new key
Select JSON as the key type
Click Create
The key file will automatically download to your computer
Store this key file securely, as it grants access to your Google Cloud resources
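Equivalently, a key can be generated from the command line. The output filename and service account email are placeholders:

```shell
# Generate a JSON key for the service account
# (replace your-project-id with your primary project's ID)
gcloud iam service-accounts keys create bq-multi-project-sa-key.json \
  --iam-account=bq-multi-project-sa@your-project-id.iam.gserviceaccount.com
```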
4. Grant Access to Additional Projects
Now, you need to grant this service account access to your additional BigQuery projects:
Navigate to the Google Cloud Console
Select the second project where you want to grant access
Go to IAM & Admin > IAM
Click Grant Access
In the "New principals" field, enter the service account email (it should look like bq-multi-project-sa@your-project-id.iam.gserviceaccount.com)
Click Add another role and add the following roles:
BigQuery Data Editor
BigQuery Job User
Click Save
Repeat steps 1-7 for each additional project that needs to be accessed
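Granting access in each additional project can also be scripted. Note that the service account email still references the primary project where it was created; only the project receiving the binding changes (second-project-id is a placeholder):

```shell
# Grant the same roles in each additional project
# (replace second-project-id with the target project's ID)
for role in roles/bigquery.dataEditor roles/bigquery.jobUser; do
  gcloud projects add-iam-policy-binding second-project-id \
    --member="serviceAccount:bq-multi-project-sa@your-project-id.iam.gserviceaccount.com" \
    --role="${role}"
done
```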
Setting Up dbt™ with BigQuery to Read and Write Across Different Projects
With our BigQuery service account now configured for multi-project access, we need to set up our dbt project to leverage these cross-project capabilities.
This configuration will enable our data transformation workflows to read source data from one BigQuery project and write the transformed results to another project, all while maintaining a clean, maintainable codebase.
What We'll Configure
In this section, we will:
Configure source definitions to explicitly reference external source projects
Establish proper model configurations to control where transformed data is written
1. Source Definitions with Explicit Project References
When working with data from different BigQuery projects, you must specify the source project ID in your source definitions. In dbt's BigQuery adapter, the database parameter corresponds to the project ID.
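For example, a source file might look like the following (the project ID, dataset, and table names here are illustrative):

```yaml
# models/sources.yml
version: 2

sources:
  - name: raw_events
    database: source-project-id  # the BigQuery project that holds the raw data
    schema: events               # the BigQuery dataset
    tables:
      - name: pageviews
      - name: sessions
```

Models can then reference these tables with {{ source('raw_events', 'pageviews') }}, and dbt will compile fully qualified references to the external project.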
2. Control Where dbt™ Models are Written
The generate_database_name macro is core dbt functionality that determines which BigQuery project your models write to. dbt™ includes this macro by default, and its standard implementation works like this:
If you specify a database in your model's config, it uses that database
Otherwise, it falls back to the database from your active dbt™ target
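The built-in macro is roughly equivalent to the following (paraphrased from dbt's default implementation; placing an override with the same name in your macros/ directory lets you customize this logic):

```sql
{% macro generate_database_name(custom_database_name=none, node=none) -%}
    {#- Default to the database (BigQuery project) from the active target -#}
    {%- set default_database = target.database -%}
    {%- if custom_database_name is none -%}
        {{ default_database }}
    {%- else -%}
        {#- A database set in the model config wins -#}
        {{ custom_database_name | trim }}
    {%- endif -%}
{%- endmacro %}
```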
To specify which project and schema a model should be written to, use the config block at the top of your model file.
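A minimal sketch, assuming a target project named analytics-project-id and an upstream staging model stg_orders (both placeholders):

```sql
-- models/marts/customer_orders.sql
{{ config(
    database='analytics-project-id',  -- BigQuery project to write to
    schema='marts',                   -- BigQuery dataset to write to
    materialized='table'
) }}

select
    customer_id,
    count(*) as order_count
from {{ ref('stg_orders') }}
group by customer_id
```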
Set Default Project-Level Destinations
For larger projects, set default destinations by model category in your dbt_project.yml file.
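For instance, folder-level +database and +schema settings can route whole categories of models to different projects. The project name and folder layout below are illustrative:

```yaml
# dbt_project.yml
models:
  my_dbt_project:
    staging:
      +database: staging-project-id    # staging models land in this project
      +schema: staging
    marts:
      +database: analytics-project-id  # production marts land in this project
      +schema: marts
```

Individual models can still override these defaults with their own config block.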