Snowflake Storage
Paradime enables you to read all the files generated by production schedules running with Paradime from the Paradime-hosted AWS S3 bucket, using a Snowflake Storage Integration.
To set up this integration, reach out to the Paradime team to get:
STORAGE_AWS_ROLE_ARN
STORAGE_ALLOWED_LOCATIONS
You will need these to create the storage integration in your Snowflake account.
1. Create a Snowflake Storage Integration
After receiving the AWS role and S3 path from the Paradime team, you will need to create a new storage integration to connect your Snowflake account to the AWS S3 bucket containing your schedules' run metadata.
Execute the SQL command below in Snowflake, replacing the placeholder text with the appropriate values.
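A minimal sketch of the command, assuming an illustrative integration name `paradime_storage_integration` (the placeholder values in angle brackets are the credentials shared by the Paradime team):

```sql
-- Replace the angle-bracket placeholders with the STORAGE_AWS_ROLE_ARN
-- and STORAGE_ALLOWED_LOCATIONS values provided by the Paradime team.
-- The integration name below is illustrative; choose your own.
CREATE STORAGE INTEGRATION paradime_storage_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = '<STORAGE_AWS_ROLE_ARN>'
  STORAGE_ALLOWED_LOCATIONS = ('<STORAGE_ALLOWED_LOCATIONS>');
```

Note that creating a storage integration requires the ACCOUNTADMIN role (or a role granted the CREATE INTEGRATION privilege).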
Now retrieve the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID for the storage integration you have just created by executing the command below, and share them with the Paradime team to complete the configuration.
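These two values appear as rows in the output of a `DESC` command (shown here against the illustrative integration name from the earlier step):

```sql
-- The STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID properties
-- appear as rows in the result; share both with the Paradime team.
DESC STORAGE INTEGRATION paradime_storage_integration;
```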
2. Create a Snowflake External Stage to query our metadata
You can now create a Snowflake External Stage to specify where data files are stored so that the data in the files can be loaded into a table.
First, let's create a file format in Snowflake. This will be needed in the next step when creating the stage.
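Since dbt artifacts such as `manifest.json` are JSON files, a JSON file format fits here. A sketch, assuming the `ANALYTICS.EXTERNAL` location used in the example below and an illustrative format name:

```sql
-- A JSON file format for reading dbt artifacts (manifest.json, etc.).
-- The name paradime_json_format is illustrative.
CREATE FILE FORMAT ANALYTICS.EXTERNAL.paradime_json_format
  TYPE = 'JSON';
```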
In the example below we are going to create an external stage based on our storage integration, in the database ANALYTICS and the schema EXTERNAL.
Note that you can create multiple Snowflake stages connecting to your Snowflake storage integration. Below we are going to create a stage to access schedule metadata, such as manifest.json, for any of the schedules in our default Paradime workspace (this is usually named after the organization name you entered at login/signup).
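A sketch of the stage, reusing the illustrative integration and file format names from the earlier steps; the URL placeholder is the S3 path shared by the Paradime team:

```sql
-- A stage covering the whole workspace folder, so every schedule's
-- artifacts are reachable. Object names are illustrative.
CREATE STAGE ANALYTICS.EXTERNAL.paradime_bolt_stage
  STORAGE_INTEGRATION = paradime_storage_integration
  URL = '<STORAGE_ALLOWED_LOCATIONS>'
  FILE_FORMAT = ANALYTICS.EXTERNAL.paradime_json_format;
```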
Alternatively, you can create a stage pointing directly to a given schedule's folder in the AWS S3 bucket.
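For example, scoping the stage URL to a single schedule folder (the `<schedule_name>` segment is a placeholder for your own schedule's folder name):

```sql
-- A stage scoped to one schedule's folder. Object names and the
-- <schedule_name> path segment are illustrative placeholders.
CREATE STAGE ANALYTICS.EXTERNAL.daily_run_stage
  STORAGE_INTEGRATION = paradime_storage_integration
  URL = '<STORAGE_ALLOWED_LOCATIONS>/<schedule_name>'
  FILE_FORMAT = ANALYTICS.EXTERNAL.paradime_json_format;
```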
Query dbt™️ run artifacts directly from Snowflake
Now that the Snowflake stage has been configured, you can query your Bolt schedules' metadata directly from Snowflake with a query like the one below.
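One way to sketch such a query, using Snowflake's syntax for querying staged files and the illustrative stage and file format names from the earlier steps (the `metadata:dbt_version` path reflects the standard dbt manifest layout):

```sql
-- Read manifest.json files straight from the stage; $1 is the parsed
-- JSON document, metadata$filename identifies the source file.
SELECT
  metadata$filename                      AS file_name,
  $1:metadata:dbt_version::string        AS dbt_version
FROM @ANALYTICS.EXTERNAL.paradime_bolt_stage
  (FILE_FORMAT => 'ANALYTICS.EXTERNAL.paradime_json_format',
   PATTERN => '.*manifest.json');
```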
👉 See also: Querying Snowflake Stage.