GCP Threats
GCP BigQuery Should Have Audit Logging Enabled
More Info:
Ensure that BigQuery Audit Logging is configured properly across all projects.
Risk Level: Medium
Address: Security
Compliance Standards: HITRUST, SOC 2, NIST CSF, PCI DSS
Triage and Remediation
Remediation
To remediate the misconfiguration “GCP BigQuery Should Have Audit Logging Enabled” for GCP using the GCP console, follow these steps:
- Open the Google Cloud Console and select the project where BigQuery is enabled.
- Go to the Navigation menu and select “IAM & Admin”, then “Audit Logs”.
- In the list of services, locate and select “BigQuery”.
- In the “Log Types” panel, enable the “Admin Read”, “Data Read”, and “Data Write” log types. Admin Activity audit logs are always written; the Data Access log types control the query- and read-level entries.
- Click “Save”.
- To confirm that entries are being written, go to “Logging” > “Logs Explorer” and filter on the BigQuery service; Data Access log entries should appear as queries and table reads occur.
By following these steps, you will remediate the misconfiguration “GCP BigQuery Should Have Audit Logging Enabled” for GCP using the GCP console.
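Under the hood, this setting lives in the project’s IAM policy as an auditConfigs entry for the BigQuery service. The following is a minimal local sketch (no GCP calls; the dictionary shape follows the Cloud Resource Manager IAM policy format, and the check function is a hypothetical illustration of what a compliance scanner might test):

```python
# Required Data Access / Admin Read log types for full BigQuery audit coverage.
REQUIRED = {"ADMIN_READ", "DATA_READ", "DATA_WRITE"}

def bigquery_logging_enabled(policy):
    """Return True if the IAM policy enables all three BigQuery audit log types."""
    for cfg in policy.get("auditConfigs", []):
        # An entry for BigQuery itself or a blanket allServices entry both count.
        if cfg.get("service") in ("bigquery.googleapis.com", "allServices"):
            enabled = {c["logType"] for c in cfg.get("auditLogConfigs", [])
                       if not c.get("exemptedMembers")}  # exemptions punch holes
            if REQUIRED <= enabled:
                return True
    return False

# A compliant policy fragment, as the console steps above would produce it.
compliant = {"auditConfigs": [{"service": "bigquery.googleapis.com",
                               "auditLogConfigs": [{"logType": t}
                                                   for t in sorted(REQUIRED)]}]}
print(bigquery_logging_enabled(compliant))  # True
print(bigquery_logging_enabled({}))         # False
```

A policy whose BigQuery entry carries exemptedMembers, or is missing one of the three log types, would fail the same check.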
To remediate the misconfiguration of GCP BigQuery not having audit logging enabled, follow these steps using the GCP CLI:
- Open the Cloud Shell in your GCP console.
- Run the following command to check whether Data Access audit logging is already configured for BigQuery:
gcloud projects get-iam-policy <project-id> --format="json(auditConfigs)"
If the output contains an entry for bigquery.googleapis.com (or allServices) covering the ADMIN_READ, DATA_READ, and DATA_WRITE log types, audit logging is already enabled. If not, proceed to the next step.
- Download the project’s IAM policy so the audit configuration can be added:
gcloud projects get-iam-policy <project-id> --format=json > policy.json
Edit policy.json and add the following section:
"auditConfigs": [
  {
    "service": "bigquery.googleapis.com",
    "auditLogConfigs": [
      { "logType": "ADMIN_READ" },
      { "logType": "DATA_READ" },
      { "logType": "DATA_WRITE" }
    ]
  }
]
Then apply the updated policy:
gcloud projects set-iam-policy <project-id> policy.json
Note: Replace <project-id> with your GCP project ID.
- To also export the BigQuery audit logs to a BigQuery dataset, first create the dataset:
bq mk <dataset-name>
Note: Replace <dataset-name> with the desired name for your dataset. The export sink creates its tables automatically, so no table schema needs to be defined by hand.
- Run the following command to create a sink that exports the audit logs to the dataset:
gcloud logging sinks create bigquery-audit \
  bigquery.googleapis.com/projects/<project-id>/datasets/<dataset-name> \
  --log-filter='protoPayload.serviceName="bigquery.googleapis.com"'
Note: Replace <project-id> and <dataset-name> with the names you used above.
- Grant the sink’s writer identity permission to write into the dataset. First look up the identity:
gcloud logging sinks describe bigquery-audit --format='value(writerIdentity)'
Then bind it to the BigQuery Data Editor role (replace <writer-identity> with the value returned above):
gcloud projects add-iam-policy-binding <project-id> \
  --member=<writer-identity> \
  --role=roles/bigquery.dataEditor
- Run the following command to verify that the sink was created successfully:
gcloud logging sinks describe bigquery-audit
After following these steps, audit logging will be enabled for BigQuery in your GCP project, and the audit logs will be exported to the BigQuery dataset you created.
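Before relying on the sink, it can help to replay a sample audit-log entry against the same condition the sink’s log filter expresses. A minimal local sketch (no GCP calls; the sample entries are abbreviated and hypothetical):

```python
def matches_sink_filter(entry):
    # Mirrors --log-filter='protoPayload.serviceName="bigquery.googleapis.com"'
    return entry.get("protoPayload", {}).get("serviceName") == "bigquery.googleapis.com"

# A trimmed BigQuery data-access entry: the sink should route this one.
bq_entry = {"logName": "projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access",
            "protoPayload": {"serviceName": "bigquery.googleapis.com",
                             "methodName": "jobservice.jobcompleted"}}
# An entry from another service: the sink should ignore it.
gce_entry = {"protoPayload": {"serviceName": "compute.googleapis.com"}}

print(matches_sink_filter(bq_entry))   # True
print(matches_sink_filter(gce_entry))  # False
```

If the filter is later tightened (for example to specific methodName values), the same replay trick catches entries that would silently stop matching.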
To remediate the misconfiguration “GCP BigQuery should have audit logging enabled” for GCP using Python, follow these steps:
- Install the Google Cloud SDK and set up application-default credentials for the client library:
gcloud auth application-default login
- Install the necessary Python library:
pip install google-api-python-client
- Create a Python script with the following code. Audit logging is configured in the project’s IAM policy (the BigQuery client has no per-dataset audit setting), so the script updates the auditConfigs section of that policy:
from googleapiclient import discovery

project_id = 'my-project'  # replace with your GCP project ID
crm = discovery.build('cloudresourcemanager', 'v1')
policy = crm.projects().getIamPolicy(resource=project_id, body={}).execute()
# Drop any existing BigQuery entry, then add one with all three log types.
configs = [c for c in policy.get('auditConfigs', [])
           if c.get('service') != 'bigquery.googleapis.com']
configs.append({'service': 'bigquery.googleapis.com',
                'auditLogConfigs': [{'logType': 'ADMIN_READ'},
                                    {'logType': 'DATA_READ'},
                                    {'logType': 'DATA_WRITE'}]})
policy['auditConfigs'] = configs
crm.projects().setIamPolicy(resource=project_id,
                            body={'policy': policy,
                                  'updateMask': 'auditConfigs'}).execute()
print('Audit logging enabled for BigQuery in project ' + project_id)
- Replace my-project with the ID of the project you want to enable audit logging for.
- Run the script using the following command:
python script.py
This will enable BigQuery audit logging (Admin Read, Data Read, and Data Write) for the specified project.
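When remediating many projects in a loop, it is worth making the auditConfigs update idempotent so that a re-run does not duplicate entries. A small standalone sketch of such a merge (no GCP calls; the list shape follows the IAM policy format, and the helper name is hypothetical):

```python
def merge_bigquery_audit(configs):
    """Replace any existing BigQuery entry with one enabling all three log types."""
    kept = [c for c in configs if c.get("service") != "bigquery.googleapis.com"]
    kept.append({"service": "bigquery.googleapis.com",
                 "auditLogConfigs": [{"logType": t}
                                     for t in ("ADMIN_READ", "DATA_READ", "DATA_WRITE")]})
    return kept

once = merge_bigquery_audit([])     # first remediation run
twice = merge_bigquery_audit(once)  # second run against the already-fixed policy
print(len(once), len(twice))  # 1 1 -- re-running does not add a duplicate entry
```

Because entries for other services pass through untouched, the same merge is safe to apply to a policy that already carries audit configuration for, say, Cloud Storage.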