# Amazon S3

Paradime supports using a custom AWS S3 bucket in the customer's own AWS account to store the logs and run artifacts generated by each dbt™️ command when running schedules with Bolt.

{% content-ref url="../../bolt" %}
[bolt](https://docs.paradime.io/app-help/documentation/bolt)
{% endcontent-ref %}

### 1. Create your own AWS S3 bucket <a href="#id-1-create-your-own-aws-s3-bucket" id="id-1-create-your-own-aws-s3-bucket"></a>

In your AWS account, navigate to [Amazon S3 > Buckets > Create Bucket](https://s3.console.aws.amazon.com/s3/bucket/create?region=us-east-1). Enter the custom bucket name and select the region where you want to create your bucket.

{% hint style="warning" %}
The bucket region should be the same as the region where your Paradime instance is hosted. You can find this in [workspace settings](https://docs.paradime.io/app-help/documentation/settings/dbt/upgrade-dbt-core-version).
{% endhint %}
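If you prefer to script this step, the same bucket can be created with the AWS SDK. A minimal sketch (the bucket name and region below are hypothetical placeholders; replace them with your own values):

```python
import json

# Hypothetical values -- use your own bucket name and the region where
# your Paradime instance is hosted (see the warning above).
bucket_name = "acme-paradime-artifacts"
region = "eu-west-1"

# Parameters for S3's CreateBucket call. Outside us-east-1 the region
# must be passed as a LocationConstraint.
params = {"Bucket": bucket_name}
if region != "us-east-1":
    params["CreateBucketConfiguration"] = {"LocationConstraint": region}

# With the AWS SDK (boto3) this would be applied as:
#   boto3.client("s3", region_name=region).create_bucket(**params)
print(json.dumps(params, indent=2))
```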

### 2. Create IAM policy <a href="#id-2-create-iam-policy" id="id-2-create-iam-policy"></a>

Next, [create an IAM policy](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create-console.html) that grants the minimum permissions Paradime needs to access the Amazon S3 bucket.

Sign in to your AWS account, open the [IAM Management Console](https://console.aws.amazon.com/iam/), and in the navigation pane choose **Policies**, then **Create policy**.

<figure><img src="https://2337193041-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHET0AD04uHMgdeLAjptq%2Fuploads%2FvdLaSx8RRRpx5znnf9s5%2Fimage.png?alt=media&#x26;token=37466d92-5c71-4c00-80cd-f13403e1b3b2" alt=""><figcaption></figcaption></figure>

Select the **JSON** tab and paste the permissions JSON below, replacing `<your-bucket-name>` with the name of your bucket.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:GetObjectAcl",
                "s3:ListMultipartUploadParts",
                "s3:PutObject",
                "s3:PutObjectAcl",
                "s3:DeleteObject"
            ],
            "Resource": "arn:aws:s3:::<your-bucket-name>/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::<your-bucket-name>"
        }
    ]
}
```
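If you generate the policy rather than paste it, note that the object-level actions target keys inside the bucket (`/*` suffix on the ARN) while `s3:ListBucket` targets the bucket ARN itself. A small sketch of templating the policy above (the bucket name is a hypothetical placeholder):

```python
import json

def render_policy(bucket_name: str) -> dict:
    """Build the minimal IAM policy shown above for a given bucket name."""
    object_actions = [
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:ListMultipartUploadParts",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:DeleteObject",
    ]
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # object-level permissions apply to keys inside the bucket
                "Effect": "Allow",
                "Action": object_actions,
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            },
            {   # ListBucket applies to the bucket ARN itself (no /*)
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket_name}",
            },
        ],
    }

print(json.dumps(render_policy("acme-paradime-artifacts"), indent=4))
```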

### 3. Create AWS User and attach the previously created IAM policy <a href="#id-3-create-iam-user" id="id-3-create-iam-user"></a>

* Next, [create a new user](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_users_create.html) in AWS.

<figure><img src="https://2337193041-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHET0AD04uHMgdeLAjptq%2Fuploads%2FVKwj2Zqk2obEvI8UsJoQ%2Fimage.png?alt=media&#x26;token=15ab7674-654e-4cb7-a5f0-bc438c3764a5" alt=""><figcaption></figcaption></figure>

* Assign the IAM policy created in the previous step to this user.

<figure><img src="https://2337193041-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHET0AD04uHMgdeLAjptq%2Fuploads%2FCScJDOgqtmWyvLEFIAYP%2Fimage.png?alt=media&#x26;token=e27360cf-7676-40b2-95ed-6322fcb46092" alt=""><figcaption></figcaption></figure>

### 4. Configure CORS policy for the S3 bucket <a href="#id-4-configure-cors-policy-for-the-s3-bucket" id="id-4-configure-cors-policy-for-the-s3-bucket"></a>

Set up a CORS policy so that Paradime can access files when the request is initiated from the Paradime app.

* In AWS, navigate to S3 and select the Paradime S3 bucket created in the previous steps.
* Select the **Permissions** tab, then scroll to **Cross-origin resource sharing (CORS)**.
* Provide the configuration below:

```json
[
    {
        "AllowedHeaders": [],
        "AllowedMethods": [
            "GET"
        ],
        "AllowedOrigins": [
            "*.paradime.io"
        ],
        "ExposeHeaders": []
    }
]
```
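S3 matches the request's `Origin` header against `AllowedOrigins`, treating `*` as a single wildcard, so `*.paradime.io` also covers the scheme and subdomain of the requesting origin. A sketch of that matching, approximated with a glob (the origin URL and bucket name are illustrative; the `put_bucket_cors` call is shown only in a comment):

```python
import fnmatch
import json

cors_rules = [
    {
        "AllowedHeaders": [],
        "AllowedMethods": ["GET"],
        "AllowedOrigins": ["*.paradime.io"],
        "ExposeHeaders": [],
    }
]

# S3's single-wildcard origin matching behaves roughly like a glob, so
# "*.paradime.io" matches a full browser Origin such as the one below.
assert fnmatch.fnmatch("https://app.paradime.io", "*.paradime.io")

# With boto3 this configuration would be applied as (not executed here):
#   boto3.client("s3").put_bucket_cors(
#       Bucket="acme-paradime-artifacts",
#       CORSConfiguration={"CORSRules": cors_rules})
print(json.dumps(cors_rules, indent=4))
```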

### 5. Create Access Key ID and Secret Access Key <a href="#id-5-create-access-key-id-and-secret-access-key" id="id-5-create-access-key-id-and-secret-access-key"></a>

To generate the credentials needed for Paradime integration:

1. Navigate to **IAM > Users** and select the user you created in the previous step
2. Go to the **Security credentials** tab
3. Click **Create access key**

<figure><img src="https://2337193041-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHET0AD04uHMgdeLAjptq%2Fuploads%2FoTQNdWO1B4HTVeGKjbhw%2Fimage.png?alt=media&#x26;token=2c47713a-5c3c-4605-ae7f-6032bd590fef" alt=""><figcaption></figcaption></figure>

4. On the "Access key best practices & alternatives" page:

* Select **Third-party service** as your use case since Paradime will be accessing your S3 bucket
* AWS will display a warning recommending temporary security credentials (IAM roles) instead of long-term access keys
* Check the confirmation box "I understand the above recommendation and want to proceed to create an access key"
* Click **Next**

5. Optionally add a description tag for the access key
6. Click **Create access key**

{% hint style="info" %}
**Important:** Copy and securely store both the **Access Key** and the **Secret Access Key**; the secret access key will not be shown again.
{% endhint %}

<figure><img src="https://2337193041-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FHET0AD04uHMgdeLAjptq%2Fuploads%2F0qBAF6vJKNHNrnL0EjK0%2Fimage.png?alt=media&#x26;token=27eaa0c5-0a60-4918-80df-2d52c624e175" alt=""><figcaption></figcaption></figure>

### 6. Send the details to the Paradime team <a href="#id-6-send-the-details-to-the-paradime-team" id="id-6-send-the-details-to-the-paradime-team"></a>

Once the custom S3 bucket, IAM policy, and user are created, reach out to the Paradime team at <support@paradime.io> and securely share the details below via a password manager such as Dashlane or 1Password.

* **Bucket Name**
* **Bucket region**
* **Access Key**
* **Secret Access Key**

Once the integration is set up, we will create a dummy file `paradime-empty-test-file` in the customer's S3 bucket to verify that Paradime can connect and write to it.

From this point onwards, dbt™️ run logs and artifacts will be stored in the custom S3 bucket.

## FAQ: How are dbt™️ artifacts stored in the S3 bucket? <a href="#faq-how-the-dbt-artifacts-are-stored-in-the-s3-buckets" id="faq-how-the-dbt-artifacts-are-stored-in-the-s3-buckets"></a>

In the S3 bucket you will see a `bolt` folder containing all run artifacts, organized by schedule name using the folder structure below.

### Example <a href="#example" id="example"></a>

```
bolt/
└─ run/
   ├─ bi_sales_hourly/
   ├─ bi_sales_daily/
   └─ daily_general/
      ├─ current/   # latest target folder for each command of the schedule
      │  ├─ 47531_dbt_seed/
      │  └─ 47532_dbt_run/
      │     └─ target/
      │        ├─ compiled/
      │        ├─ run/
      │        ├─ graph.gpickle
      │        ├─ manifest.json
      │        ├─ partial_parse.msgpack
      │        └─ run_results.json
      └─ 2022/      # archived runs, dated YYYY/MM/DD
         └─ 09/
            ├─ 16/
            ├─ 17/
            └─ 18/
               └─ 58602_1663718545.541192/
                  ├─ 47531_dbt_seed/
                  └─ 47532_dbt_run/
                     └─ target/
                        ├─ compiled/
                        ├─ run/
                        ├─ graph.gpickle
                        ├─ manifest.json
                        ├─ partial_parse.msgpack
                        └─ run_results.json
```
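To illustrate the layout, the S3 keys for a schedule's artifacts can be assembled as below. The schedule and command names are the hypothetical ones from the tree, and the helper functions are illustrative, not part of Paradime:

```python
from datetime import datetime, timezone

def latest_manifest_key(schedule: str, command_dir: str) -> str:
    """Key of the most recent manifest.json under the schedule's current/ folder."""
    return f"bolt/run/{schedule}/current/{command_dir}/target/manifest.json"

def archived_run_prefix(schedule: str, run_id: int, started_at: datetime) -> str:
    """Prefix of an archived run, dated YYYY/MM/DD as in the tree above."""
    date_part = started_at.strftime("%Y/%m/%d")
    return f"bolt/run/{schedule}/{date_part}/{run_id}_{started_at.timestamp()}/"

print(latest_manifest_key("daily_general", "47532_dbt_run"))
# → bolt/run/daily_general/current/47532_dbt_run/target/manifest.json

print(archived_run_prefix("daily_general", 58602,
                          datetime(2022, 9, 18, tzinfo=timezone.utc)))
```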
