# BigQuery Tools

The BigQuery Tools allow DinoAI to explore your Google BigQuery account — listing projects, datasets, tables, and columns — and to analyse the performance of specific queries. This gives DinoAI the context it needs to help you write accurate SQL, build dbt models, and investigate cost or performance issues, all without leaving Paradime.

{% hint style="info" icon="plug-circle-plus" %}
**Requires a BigQuery connection.** These tools are only available when your workspace is connected to BigQuery. See your workspace settings to [configure a BigQuery credential](https://docs.paradime.io/app-help/documentation/settings/connections/development-environment/bigquery).
{% endhint %}

#### Capabilities

The BigQuery Tools give DinoAI the following abilities:

* List all projects and datasets in your BigQuery account
* List all tables within a given project and dataset
* Inspect column names, types, modes, descriptions, and nested fields for any table
* Retrieve full performance statistics for a specific BigQuery job, including bytes processed, slot usage, shuffle spill, and execution time

#### Using the BigQuery Tools

1. Open DinoAI in the right panel of the Code IDE
2. Mention the resource you want to explore or investigate (e.g., a table name, a query job ID, or a dataset)
3. Describe what you want DinoAI to do with that information (e.g., generate a source file, diagnose a slow job, or summarise cost)
4. Grant permission when DinoAI asks to access your BigQuery account
5. Review the results and implement DinoAI's suggested actions

#### Example Use Cases

**Generating a dbt Source File**

**Prompt**

```
List the columns in the `my-project.raw_data.orders` table and generate a dbt sources.yml file for it.
```

**Result:** DinoAI fetches all column names, types, and descriptions from the BigQuery table schema and produces a ready-to-use `sources.yml` file with the correct structure, column definitions, and any available descriptions pre-filled.
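
The generated file follows the standard dbt sources spec. A minimal sketch of what the output might look like for the table above (the column names and descriptions here are illustrative, not taken from a real schema):

```yaml
version: 2

sources:
  - name: raw_data
    database: my-project  # BigQuery project
    schema: raw_data      # BigQuery dataset
    tables:
      - name: orders
        description: "Raw orders ingested from the source system."
        columns:
          - name: order_id       # illustrative column
            description: "Unique identifier for the order."
          - name: ordered_at     # illustrative column
            description: "Timestamp the order was placed."
```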

**Investigating a Slow or Expensive Query**

**Prompt**

```
The job bquxjob_28b5b82e_19c066dea29 ran slowly yesterday. What was the issue?
```

**Result:** DinoAI queries `INFORMATION_SCHEMA.JOBS_BY_ORGANIZATION` for that job ID, surfaces bytes processed, slot consumption, shuffle spill, and execution time, then gives you a diagnosis and recommendations for optimisation.
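
The lookup is roughly equivalent to a query like the following sketch (the `region-us` qualifier is an assumption — it should match the region your jobs run in):

```sql
SELECT
  job_id,
  total_bytes_processed,
  total_slot_ms,
  TIMESTAMP_DIFF(end_time, start_time, SECOND) AS execution_seconds
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_ORGANIZATION
WHERE job_id = 'bquxjob_28b5b82e_19c066dea29'
  -- Filtering on creation_time keeps the partition scan small
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY);
```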

**Exploring an Unfamiliar Dataset**

**Prompt**

```
What tables are in the `analytics` dataset in my `my-project` project?
```

**Result:** DinoAI lists every table in the dataset so you can orient yourself before writing queries or building models against it.
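
If you want to verify the result yourself, the same listing is available through a standard `INFORMATION_SCHEMA` query (project and dataset names below mirror the example prompt):

```sql
SELECT table_name, table_type
FROM `my-project.analytics.INFORMATION_SCHEMA.TABLES`
ORDER BY table_name;
```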

#### Working with Other Tools

The BigQuery Tools work well alongside DinoAI's other capabilities:

* Combine with the **dbt Tools** to inspect source tables and immediately scaffold dbt models or source definitions on top of them
* Combine with the **Catalog Search Tool** to cross-reference BigQuery table structure with existing dbt model documentation
* Combine with the **Column Level Lineage Tool** to trace how a specific column flows from a raw BigQuery table through your dbt transformations

#### Best Practices

* **Provide full identifiers** — BigQuery identifiers take the form `project.dataset.table`; including the project name helps DinoAI navigate directly to the right resource without an extra lookup
* **Use the listing tools first** — If you're unsure of exact names, ask DinoAI to list projects, datasets, or tables before drilling into columns or running performance queries
* **Include a date range for performance queries** — The query performance tool defaults to a 7-day window; specifying `start_date` and `end_date` narrows results and speeds up the lookup
* **Check permissions** — DinoAI surfaces an `[ERROR]` message if it lacks access to a project or dataset; confirm your BigQuery credential has the necessary IAM roles
