Generate a password for the Paradime user and replace the placeholder text <your_generated_password_here> in the step below.
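If you need a quick way to generate one, a minimal sketch (assuming the openssl CLI is available on your machine) is:

```shell
# Generate 24 random bytes and base64-encode them into a strong password string
openssl rand -base64 24
```

Any sufficiently long random string works here; a password manager's generator is equally fine.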
-- 1. Create user and role
use role useradmin;

create user if not exists paradime_cost_user
  password = '<your_generated_password_here>'
  default_role = paradime_cost_role
  default_warehouse = paradime_warehouse
  comment = 'Used by paradime.io for cost analytics';

create role if not exists paradime_cost_role
  comment = 'Used by paradime.io for cost analytics';

grant role paradime_cost_role to role sysadmin;
grant role paradime_cost_role to user paradime_cost_user;
Create a warehouse for Paradime and grant the role access to it.

-- 2. Create warehouse
use role sysadmin;

create warehouse if not exists paradime_warehouse
  warehouse_size = xsmall
  auto_suspend = 60
  initially_suspended = true
  comment = 'Used by paradime.io for cost analytics';

grant monitor, operate, usage, modify on warehouse paradime_warehouse to role paradime_cost_role;
Before running this command, you will need to update STORAGE_AWS_ROLE_ARN and STORAGE_ALLOWED_LOCATIONS with your own region and company token.
You can find the region and company token in the Paradime workspace settings page.
-- 3. Create S3 integration
use role accountadmin;

create or replace storage integration PARADIME_S3_STORAGE_INTEGRATION
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::817727247863:role/paradime-snowflake-role-<your_paradime_region>-<your_company_token>' -- replace region and company token before running this command
  STORAGE_ALLOWED_LOCATIONS = ('s3://paradime-s3-<your_company_token>-<your_paradime_region>/snowflake/'); -- replace region and company token before running this command

grant usage on integration PARADIME_S3_STORAGE_INTEGRATION to role paradime_cost_role;
Create a new database and schema for Paradime
-- 4. Create Paradime database and schema, and grant access
use role accountadmin;

create or replace database PARADIME_COST_ANALYTICS;
create or replace schema PARADIME_COST_ANALYTICS.CORE;

grant usage on database PARADIME_COST_ANALYTICS to role paradime_cost_role;
grant usage on schema PARADIME_COST_ANALYTICS.CORE to role paradime_cost_role;
Create a new Snowflake file format
-- 5. Create file format and grant privileges
use role accountadmin;

create or replace file format PARADIME_COST_ANALYTICS.CORE.PARADIME_JSON_FORMAT
  type = JSON
  NULL_IF = ();

grant all privileges on file format PARADIME_COST_ANALYTICS.CORE.PARADIME_JSON_FORMAT to role paradime_cost_role;
Grant access to Snowflake metadata to the Paradime user
-- 6. Grant access to Snowflake metadata
use role accountadmin;
grant imported privileges on database snowflake to role paradime_cost_role;
Enable the Paradime user to manage and monitor Snowflake Warehouses
-- 7. Grant access to monitor warehouses
use role accountadmin;
grant monitor usage on account to role paradime_cost_role;

-- Grant Paradime access to manage warehouses
use role accountadmin;
grant manage warehouses on account to role paradime_cost_role;
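Optionally, you can sanity-check the setup before moving on. The statements below are not part of the official script, just standard Snowflake commands that list the privileges granted so far:

```sql
-- Verify the privileges granted to the Paradime role and user
show grants to role paradime_cost_role;
show grants to user paradime_cost_user;
```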
Full script here 👇
Snowflake setup guide script
-- 1. Create user and role
use role useradmin;

create user if not exists paradime_cost_user
  password = '<your_generated_password_here>'
  default_role = paradime_cost_role
  default_warehouse = paradime_warehouse
  comment = 'Used by paradime.io for cost analytics';

create role if not exists paradime_cost_role
  comment = 'Used by paradime.io for cost analytics';

grant role paradime_cost_role to role sysadmin;
grant role paradime_cost_role to user paradime_cost_user;

-- 2. Create warehouse
use role sysadmin;

create warehouse if not exists paradime_warehouse
  warehouse_size = xsmall
  auto_suspend = 60
  initially_suspended = true
  comment = 'Used by paradime.io for cost analytics';

grant monitor, operate, usage, modify on warehouse paradime_warehouse to role paradime_cost_role;

-- 3. Create S3 integration
use role accountadmin;

create or replace storage integration PARADIME_S3_STORAGE_INTEGRATION
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::817727247863:role/paradime-snowflake-role-<your_paradime_region>-<your_company_token>' -- replace region and company token before running this command
  STORAGE_ALLOWED_LOCATIONS = ('s3://paradime-s3-<your_company_token>-<your_paradime_region>/snowflake/'); -- replace region and company token before running this command

grant usage on integration PARADIME_S3_STORAGE_INTEGRATION to role paradime_cost_role;

-- 4. Create Paradime database and schema, and grant access
use role accountadmin;

create or replace database PARADIME_COST_ANALYTICS;
create or replace schema PARADIME_COST_ANALYTICS.CORE;

grant usage on database PARADIME_COST_ANALYTICS to role paradime_cost_role;
grant usage on schema PARADIME_COST_ANALYTICS.CORE to role paradime_cost_role;

-- 5. Create file format and grant privileges
use role accountadmin;

create or replace file format PARADIME_COST_ANALYTICS.CORE.PARADIME_JSON_FORMAT
  type = JSON
  NULL_IF = ();

grant all privileges on file format PARADIME_COST_ANALYTICS.CORE.PARADIME_JSON_FORMAT to role paradime_cost_role;

-- 6. Grant access to Snowflake metadata
use role accountadmin;
grant imported privileges on database snowflake to role paradime_cost_role;

-- 7. Grant access to monitor warehouses
use role accountadmin;
grant monitor usage on account to role paradime_cost_role;

-- Grant Paradime access to manage warehouses
use role accountadmin;
grant manage warehouses on account to role paradime_cost_role;
2. dbt™️ project setup guide
Create a macro to manage query comments
To enable Paradime to enrich your Snowflake queries with additional metadata, you will need to create a new dbt™️ macro called get_query_comment.sql in the macros folder of your project.
get_query_comment.sql dbt™️ macro
get_query_comment.sql
{% macro get_query_comment(node, extra={}) %}
  {%- set comment_dict = extra -%}
  {%- do comment_dict.update(
    app='dbt',
    dbt_version=dbt_version,
    project_name=project_name,
    target_name=target.name,
    target_database=target.database,
    target_schema=target.schema,
    invocation_id=invocation_id
  ) -%}
  {%- if node is not none -%}
    {%- do comment_dict.update(
      node_name=node.name,
      node_alias=node.alias,
      node_package_name=node.package_name,
      node_original_file_path=node.original_file_path,
      node_database=node.database,
      node_schema=node.schema,
      node_id=node.unique_id,
      node_resource_type=node.resource_type,
      node_meta=node.config.meta,
      node_tags=node.tags,
      full_refresh=flags.FULL_REFRESH,
      which=flags.WHICH,
    ) -%}
    {%- if flags.INVOCATION_COMMAND -%}
      {%- do comment_dict.update(
        invocation_command=flags.INVOCATION_COMMAND
      ) -%}
    {%- endif -%}
    {%- if node.resource_type != 'seed' -%}
      {# Otherwise this throws an error saying 'Seeds cannot depend on other nodes.' #}
      {%- if node.refs is defined -%}
        {% set refs = [] %}
        {% for ref in node.refs %}
          {%- if dbt_version >= '1.5.0' -%}
            {%- do refs.append(ref.name) -%}
          {%- else -%}
            {%- do refs.append(ref[0]) -%}
          {%- endif -%}
        {% endfor %}
        {%- do comment_dict.update(
          node_refs=refs | unique | list
        ) -%}
      {%- endif -%}
    {%- endif -%}
    {%- if node.resource_type == 'model' -%}
      {%- do comment_dict.update(
        materialized=node.config.materialized,
      ) -%}
    {%- endif -%}
  {%- endif -%}

  {# Only if using dbt Cloud #}
  {%- if env_var('DBT_CLOUD_PROJECT_ID', False) -%}
    {%- do comment_dict.update(
      dbt_cloud_project_id=env_var('DBT_CLOUD_PROJECT_ID')
    ) -%}
  {%- endif -%}
  {%- if env_var('DBT_CLOUD_JOB_ID', False) -%}
    {%- do comment_dict.update(
      dbt_cloud_job_id=env_var('DBT_CLOUD_JOB_ID')
    ) -%}
  {%- endif -%}
  {%- if env_var('DBT_CLOUD_RUN_ID', False) -%}
    {%- do comment_dict.update(
      dbt_cloud_run_id=env_var('DBT_CLOUD_RUN_ID')
    ) -%}
  {%- endif -%}
  {%- if env_var('DBT_CLOUD_RUN_REASON_CATEGORY', False) -%}
    {%- do comment_dict.update(
      dbt_cloud_run_reason_category=env_var('DBT_CLOUD_RUN_REASON_CATEGORY')
    ) -%}
  {%- endif -%}
  {%- if env_var('DBT_CLOUD_RUN_REASON', False) -%}
    {%- do comment_dict.update(
      dbt_cloud_run_reason=env_var('DBT_CLOUD_RUN_REASON')
    ) -%}
  {%- endif -%}

  {# Only if using the Bolt scheduler #}
  {%- if env_var('PARADIME_SCHEDULE_NAME', False) -%}
    {%- do comment_dict.update(
      paradime_schedule_name=env_var('PARADIME_SCHEDULE_NAME')
    ) -%}
  {%- endif -%}
  {%- if env_var('PARADIME_SCHEDULE_RUN_ID', False) -%}
    {%- do comment_dict.update(
      paradime_schedule_run_id=env_var('PARADIME_SCHEDULE_RUN_ID')
    ) -%}
  {%- endif -%}
  {%- if env_var('PARADIME_SCHEDULE_RUN_START_DTTM', False) -%}
    {%- do comment_dict.update(
      paradime_schedule_run_start_dttm=env_var('PARADIME_SCHEDULE_RUN_START_DTTM')
    ) -%}
  {%- endif -%}
  {%- if env_var('PARADIME_SCHEDULE_TRIGGER', False) -%}
    {%- do comment_dict.update(
      paradime_schedule_trigger=env_var('PARADIME_SCHEDULE_TRIGGER')
    ) -%}
  {%- endif -%}
  {%- if env_var('PARADIME_SCHEDULE_GIT_SHA', False) -%}
    {%- do comment_dict.update(
      paradime_schedule_git_sha=env_var('PARADIME_SCHEDULE_GIT_SHA')
    ) -%}
  {%- endif -%}

  {{ return(tojson(comment_dict)) }}
{% endmacro %}
Update your dbt_project.yml file
This step ensures that, on each dbt™️ run, the query comment is appended to the queries executed in Snowflake.
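As a reference, the relevant addition typically looks like the block below. This is a sketch of the standard dbt query-comment configuration, to be merged into your existing dbt_project.yml; check it against the full file contents provided by Paradime.

```yaml
query-comment:
  comment: "{{ get_query_comment(node) }}"
  append: true # Snowflake removes leading comments, so append the comment to the end of the query
```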