BigQuery cost connection
IP RESTRICTIONS
Make sure to allow traffic from the Paradime IPs in your firewall, based on the data location you selected.
Follow these steps to create a new service account user for Paradime and grant the required permissions at the project level:
Go to Google Cloud Console and:
Select your project from the project selector at the top.
Go to the "IAM & Admin" section.
Create Service Account:
Click on "Service Accounts".
Click "Create Service Account".
Enter the Service Account name (e.g., paradime-radar-user).
Click "Create and Continue".
Assign Roles:
Assign the necessary roles to the service account:
BigQuery Resource Viewer
BigQuery Job User
Complete and Save:
Click "Done" after assigning the roles.
The service account will now be created with the assigned permissions.
Generate Key:
Go to the "Keys" tab and click "Add Key" -> "Create New Key".
Choose JSON format and save the key file securely. (We will need this later to connect BigQuery to Paradime.)
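If you prefer the command line, the same service account setup can be sketched with gcloud. The project ID, display name, and key file name below are placeholders, not values required by Paradime:

```bash
# Hypothetical project ID; replace with your own.
PROJECT_ID="my-gcp-project"

# Create the service account.
gcloud iam service-accounts create paradime-radar-user \
  --display-name="Paradime Radar" \
  --project="${PROJECT_ID}"

# Grant the two project-level roles.
SA_EMAIL="paradime-radar-user@${PROJECT_ID}.iam.gserviceaccount.com"
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.resourceViewer"
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.jobUser"

# Create and download the JSON key; keep it safe, Paradime needs it later.
gcloud iam service-accounts keys create paradime-key.json \
  --iam-account="${SA_EMAIL}"
```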
Make sure to name the BigQuery Dataset paradime_cost_analytics.
To set up your BigQuery Project for Paradime, follow these steps:
Create a Dataset:
Name: paradime_cost_analytics
Select the appropriate Dataset region (this will be needed for the Paradime connection setup).
Grant Access:
Assign the BigQuery User and BigQuery Data Editor roles to the service account user created previously, ensuring these permissions are granted at the Dataset level.
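As a command-line sketch (the project ID is a placeholder), the dataset can be created with bq, and dataset-level access granted by editing the dataset's access list:

```bash
# Hypothetical project ID; pick the region you noted for the Paradime connection.
bq --location=US mk --dataset "my-gcp-project:paradime_cost_analytics"

# Dataset-level access: export the dataset definition, add the service
# account to the "access" array, then write it back.
bq show --format=prettyjson "my-gcp-project:paradime_cost_analytics" > dataset.json
bq update --source dataset.json "my-gcp-project:paradime_cost_analytics"
```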
To create a log sink in Google Cloud, you need the "Logs Configuration Writer" (roles/logging.configWriter) IAM role on the project where you're creating the sink.
Make sure to use paradime_cost_analytics as the destination dataset for the Logs Routing Sink.
To set up your Logs Routing Sink for Paradime, follow these steps:
Select the Google Cloud project in which the log entries that you want to route originate.
Select Create sink.
In the Sink details panel, enter the following details:
Sink name: Provide an identifier for the sink; note that after you create the sink, you can't rename the sink but you can delete it and create a new sink.
Sink description (optional): Describe the purpose or use case for the sink.
In the Sink destination panel, select the sink service and destination by using the Select sink service menu.
Select BigQuery dataset: Select the dataset paradime_cost_analytics (created in the previous step) to receive the routed log entries.
Select the option to use partitioned tables.
Go to the Choose logs to include in sink panel and specify the log entries to include:
Select Create sink.
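Equivalently, the sink can be created with gcloud. The sink name, project ID, and log filter here are illustrative assumptions; scope the filter to the log entries you chose above:

```bash
gcloud logging sinks create paradime-cost-sink \
  "bigquery.googleapis.com/projects/my-gcp-project/datasets/paradime_cost_analytics" \
  --use-partitioned-tables \
  --log-filter='resource.type="bigquery_resource"' \
  --project="my-gcp-project"
```

Note that a sink created this way is given a writer identity, which also needs write access on the paradime_cost_analytics dataset.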
After the Logs Router Sink table has been created, you can set up an expiration for partitions older than 7 days.
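A 7-day partition expiration can be set with bq; the table name below is a placeholder for whichever table the sink created in paradime_cost_analytics:

```bash
# 604800 seconds = 7 days.
bq update --time_partitioning_expiration 604800 \
  "my-gcp-project:paradime_cost_analytics.<sink_table_name>"
```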
Leave the rest of the Bucket configuration at the default options unless your organization requires otherwise. Ensure the GCS Bucket region is the same as the Dataset region.
Navigate to Cloud Storage and select Buckets
Click "Create".
Name your GCS Bucket (e.g., paradime-bucket).
Select your Bucket region (e.g., US).
Click on "Create" to complete creating your GCS Bucket.
Set lifecycle policy
Go to the "Lifecycle" tab for your GCS Bucket.
Select "Add Rule" -> "Delete Object" action.
Set "Age" to 2 days and click "Create" to set the policy.
Grant the Paradime service account user access to the GCS Bucket
Go to the "Permissions" tab for your GCS Bucket.
Add the Paradime Service Account user email as a "New Principal" and assign the Storage Admin role.
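Or, as a CLI sketch with placeholder bucket and service account names:

```bash
gcloud storage buckets add-iam-policy-binding gs://paradime-bucket \
  --member="serviceAccount:paradime-radar-user@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/storage.admin"
```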
To enable Paradime to enrich your BigQuery queries with additional metadata, you will need to create a new dbt™️ macro called get_query_comment.sql in the macros folder of your project.
Create the Macro (get_query_comment.sql)
Update your dbt_project.yml file
This step ensures that with each dbt™️ run, the query comment is appended to the query running in BigQuery.
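The standard dbt query-comment config for this looks like the excerpt below; on BigQuery, job-label: true additionally attaches the comment as job labels:

```yaml
# dbt_project.yml (excerpt)
query-comment:
  comment: "{{ get_query_comment(node) }}"
  job-label: true
```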
To connect Paradime to BigQuery for cost tracking, follow these steps:
Click "Settings" in the top menu bar of the Paradime interface.
Click "Connections" in the left sidebar.
Click "Add New" next to the Radar Environment.
Select "BigQuery" as the connection type.
Once you have selected BigQuery, provide the following details to complete the setup:
BigQuery Service Account JSON
BigQuery Project IDs
Dataset Location for the dataset used by Paradime
GCS Bucket Name used by Paradime
Go to Google Cloud Console, and select your project.
In the Google Cloud console, go to the Log Router page. If you use the search bar to find this page, then select the result whose subheading is Logging.
Using the existing service account you created in step 1, complete the following:
Go to Google Cloud Console.
Similar to step 2, complete the following:
Go to Google Cloud Console and select your project.
Similar to step 3, complete the same process again for each BigQuery Project, routing logs into the BigQuery Dataset called paradime_cost_analytics in each respective project.
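If you have several projects, the per-project sink creation can be scripted; a minimal sketch, assuming hypothetical project IDs and the same example filter as above:

```bash
# Hypothetical project IDs; each project routes logs to its own
# paradime_cost_analytics dataset.
for PROJECT_ID in project-a project-b; do
  gcloud logging sinks create paradime-cost-sink \
    "bigquery.googleapis.com/projects/${PROJECT_ID}/datasets/paradime_cost_analytics" \
    --use-partitioned-tables \
    --log-filter='resource.type="bigquery_resource"' \
    --project="${PROJECT_ID}"
done
```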