Google BigQuery Integration
Google BigQuery is a fully managed, serverless data warehouse built on the Google Cloud Platform that supports Structured Query Language (SQL).
Prerequisites
- A user with BigQuery Admin and Storage Admin roles for the Google BigQuery project
- An active Billing account associated with your BigQuery project
Steps
- Step 1: Create a Service account
- Step 2: Get Credentials for your service account
- Step 3: Get your Dataset ID and Location
- Step 4: Setting up the Connection
Step 1: Create a Service account
1. Log in to the Google Cloud Platform with an Admin or higher role and open the API Credentials page. If prompted, select or create a project.
2. Click the “Create credentials” button. From the dropdown that appears, choose “Service account key”.
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b4fa_Screenshot_2022-09-14_at_2.32.58_PM.png)
3. Provide the Service account name, Service account ID (auto-populated), and a description, then click Create and Continue.
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b4f4_Screenshot%2B2022-09-14%2Bat%2B2.35.54%2BPM.png)
4. On the following page, grant the BigQuery Admin role, as indicated below.
5. Alternatively, you can create a custom role and attach it to the Service account. The role needs the following permissions:
bigquery.jobs.create
bigquery.jobs.get
bigquery.jobs.update
bigquery.datasets.get
bigquery.tables.list
bigquery.tables.get
bigquery.tables.getData
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b507_Screenshot%2B2022-09-14%2Bat%2B3.35.38%2BPM.png)
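If you go the custom-role route, it is easy to miss one of the permissions above. A minimal sketch for sanity-checking a role definition before attaching it (the helper name is ours, not part of any Google API):

```python
# The exact permissions Locale needs, as listed above.
REQUIRED_PERMISSIONS = {
    "bigquery.jobs.create",
    "bigquery.jobs.get",
    "bigquery.jobs.update",
    "bigquery.datasets.get",
    "bigquery.tables.list",
    "bigquery.tables.get",
    "bigquery.tables.getData",
}

def missing_permissions(role_permissions):
    """Return the required permissions absent from a custom role, sorted."""
    return sorted(REQUIRED_PERMISSIONS - set(role_permissions))
```

For example, a role that only has `bigquery.jobs.create` would come back with the six remaining permissions listed.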
Step 2: Get Credentials for your service account
1. Open the [Service Accounts](https://console.cloud.google.com/iam-admin/serviceaccounts) page and select the Service Account you just created.
2. Head to the Keys section, click Add Key, select JSON as the key type, and hit “Create”.
3. A .json file will then download to your computer. Use this file when setting up your Data Source.
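Before uploading the key to Locale, you can quickly confirm the downloaded file is a well-formed service account key. A rough stdlib-only sketch (the `check_key_file` helper is illustrative, not a Google or Locale API):

```python
import json

def check_key_file(path):
    """Lightly validate a downloaded service account key file."""
    with open(path) as f:
        key = json.load(f)
    # Every service-account key Google issues carries these fields.
    for field in ("type", "project_id", "private_key", "client_email"):
        if field not in key:
            raise ValueError(f"missing field: {field}")
    if key["type"] != "service_account":
        raise ValueError("not a service account key")
    # The project ID is also what the Locale connection form asks for.
    return key["project_id"]
```
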
Step 3: Get your Dataset ID and Location
- Go to your BigQuery instance.
- Select the Project ID.
- From the list of datasets in the project, choose the one whose dataset ID and location you need.
- Copy the Dataset ID and Data Location. The dataset ID is displayed as project-name:dataset-ID; copy only the dataset ID (the part after the colon). For example, in the image shown below, the dataset ID is test-dataset.
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b501_image.jpeg)
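If you are wiring up many datasets, splitting the displayed project-name:dataset-ID string can be scripted. A tiny illustrative helper (not part of any BigQuery API):

```python
def dataset_id_only(displayed: str) -> str:
    """Strip the 'project-name:' prefix from a displayed dataset ID."""
    # The console shows "project-name:dataset-ID"; Locale only needs
    # the part after the colon. Strings without a colon pass through.
    return displayed.split(":", 1)[1] if ":" in displayed else displayed
```

For example, `dataset_id_only("my-project:test-dataset")` returns `test-dataset`.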
Step 4: Setting up the Connection
1. Click on Settings → Data Connections → New Data Connection.
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b50e_Untitled.png)
2. Select BigQuery under Data Connection to create a new Data Source Connection, then fill in the following fields:
- Name of the Datasource: A unique name to identify the data source. You can connect multiple databases/clusters to Locale, so use this name to uniquely differentiate between them within Locale’s platform.
- Project ID: The globally unique identifier of the project where all your datasets and data tables are stored. To learn more about creating and managing projects in the Google Cloud Platform, follow this documentation: https://cloud.google.com/resource-manager/docs/creating-managing-projects
- Service Account Key: Upload your Service Account Key (the JSON file) to give Locale the permissions it needs to run queries on your databases, trigger alerts, and create incidents on top of them. See Step 1: Create a Service account above for more details.
![](https://cdn.prod.website-files.com/622aca98e7c65a16d2e273fa/662f7398b8e3a046d7e7b504_Untitled%2B%25281%2529.png)
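Put together, the connection form boils down to three inputs. A hypothetical summary of the fields (all names and values here are illustrative, not a Locale API):

```python
# The three inputs the New Data Connection form asks for.
connection = {
    "name": "prod-bigquery",              # Name of the Datasource: any unique label
    "project_id": "my-project",           # Project ID from the GCP console / key file
    "service_account_key": "./key.json",  # path to the JSON file from Step 2
}
```
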
Note: Locale will cancel queries that run for more than 90 seconds. This protects your database from rogue queries and prevents a backlog of alerts in the Locale system. If you have a use case where this limit needs to be increased, please get in touch with our Support.
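When developing alert queries, it can help to mimic this 90-second ceiling locally, so a query that Locale would cancel fails fast in testing too. A rough stdlib-only sketch (the helper and its names are ours, not part of Locale or BigQuery):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as QueryTimeout

def run_with_limit(query_fn, limit_seconds=90):
    """Run query_fn, giving up after limit_seconds like Locale does."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(query_fn)
        try:
            return future.result(timeout=limit_seconds)
        except QueryTimeout:
            future.cancel()
            raise QueryTimeout(
                f"query exceeded {limit_seconds}s and would be cancelled by Locale"
            )
```

Here `query_fn` stands in for whatever callable actually executes your BigQuery query.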
Happy Alerting 🙌🏻