Dataflow Job CMK Keys Should Be Set
More Info:
Ensure Dataflow jobs are encrypted with customer-managed keys.
Risk Level
Low
Address
Reliability, Security
Compliance Standards
SOC2, GDPR, ISO27001, HIPAA, HITRUST, NISTCSF, PCIDSS
Triage and Remediation
Remediation
To remediate the “Dataflow Job CMK Keys Should Be Set” misconfiguration in GCP using the GCP console, please follow these steps:
- Open the Google Cloud Console and go to the Dataflow page.
- Select the Dataflow job that is affected by the misconfiguration. A Dataflow job’s encryption settings cannot be changed after launch, so stop the job first (use “Drain” for streaming jobs to avoid losing in-flight data, otherwise “Cancel”).
- Click “Create job from template” (or re-run the pipeline that originally launched the job) and fill in the same job parameters as before.
- Expand the “Encryption” section of the job creation form.
- Select “Customer-managed encryption key (CMEK)” instead of “Google-managed encryption key”.
- Choose the appropriate Cloud KMS key that you want to use to encrypt your data.
- Click “Run job” to launch the replacement job.
After following these steps, your Dataflow job will be configured to use a customer-managed key for encryption.
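If you have many jobs to check, a short script can help you find the ones that still need remediation. The following is a minimal sketch, assuming the Dataflow v1b3 REST API via google-api-python-client, Application Default Credentials, and placeholder project/region values; a job launched without a CMEK has no serviceKmsKeyName in its environment:
```python
from googleapiclient.discovery import build
import google.auth

# Placeholder values: replace with your project ID and region
project_id = "<project-id>"
location = "<location>"

credentials, _ = google.auth.default()
service = build('dataflow', 'v1b3', credentials=credentials)

# JOB_VIEW_ALL includes the environment block, where the CMEK is recorded
request = service.projects().locations().jobs().list(
    projectId=project_id, location=location, view='JOB_VIEW_ALL'
)
while request is not None:
    response = request.execute()
    for job in response.get('jobs', []):
        if not job.get('environment', {}).get('serviceKmsKeyName'):
            print(f"Job {job['id']} ({job['name']}) has no customer-managed key")
    request = service.projects().locations().jobs().list_next(request, response)
```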
To remediate the “Dataflow Job CMK Keys Should Be Set” misconfiguration in GCP using the GCP CLI, you can follow the steps below:
- Open the Cloud Shell in your GCP Console.
- Because a Dataflow job’s encryption key cannot be changed after it has been launched, stop the non-compliant job first (drain streaming jobs to avoid losing in-flight data):
```bash
gcloud dataflow jobs drain <JOB_ID> --region=<REGION>
```
- Relaunch the job from its template, setting the Cloud KMS key with the --dataflow-kms-key flag:
```bash
gcloud dataflow jobs run <JOB_NAME> \
  --gcs-location=<TEMPLATE_PATH> \
  --region=<REGION> \
  --dataflow-kms-key=<KMS_KEY>
```
Replace <JOB_NAME> with a name for the replacement job, <TEMPLATE_PATH> with the Cloud Storage path of the job’s template, and <KMS_KEY> with the fully qualified name of the Cloud KMS key (projects/<project-id>/locations/<region>/keyRings/<key-ring>/cryptoKeys/<key>) to be used for encrypting the Dataflow job’s data at rest.
- Verify that the KMS key has been set by running the following command:
```bash
gcloud dataflow jobs describe <JOB_ID> --region=<REGION> --full | grep -i kmsKeyName
```
This command should return the serviceKmsKeyName that was set in the previous step.
- Repeat these steps for all Dataflow jobs that require a KMS key to be set.
By following these steps, you can remediate the “Dataflow Job CMK Keys Should Be Set” misconfiguration in GCP using the GCP CLI.
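Before a job can use a CMEK, the Dataflow service agent and the worker service account must be allowed to use the key. The sketch below is a minimal example using the google-cloud-kms client; the service-account naming patterns (service-<project-number>@dataflow-service-producer-prod.iam.gserviceaccount.com and the default Compute Engine account) are the documented defaults, but confirm the accounts your jobs actually run as:
```python
from google.cloud import kms_v1

# Placeholder values: replace with your own project and key details
project_id = "<project-id>"
project_number = "<project-number>"
location = "<location>"
key_ring_name = "<key-ring-name>"
key_name = "<key-name>"

client = kms_v1.KeyManagementServiceClient()
key_path = client.crypto_key_path(project_id, location, key_ring_name, key_name)

# Grant Encrypter/Decrypter on the key to the Dataflow service agent
# and to the default Compute Engine (worker) service account
policy = client.get_iam_policy(request={"resource": key_path})
policy.bindings.add(
    role="roles/cloudkms.cryptoKeyEncrypterDecrypter",
    members=[
        f"serviceAccount:service-{project_number}@dataflow-service-producer-prod.iam.gserviceaccount.com",
        f"serviceAccount:{project_number}-compute@developer.gserviceaccount.com",
    ],
)
client.set_iam_policy(request={"resource": key_path, "policy": policy})
```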
To remediate the misconfiguration “Dataflow Job CMK Keys Should Be Set” for GCP using Python, you can follow the steps below:
- First, you need to create a Cloud KMS key ring and key in the same region as your Dataflow job. You can use the following code to create a key ring and key:
```python
from google.cloud import kms_v1

# Replace <project-id> with your GCP project ID
# Replace <location> with the region of your Dataflow job
# Replace <key-ring-name> with the name of the key ring you want to create
# Replace <key-name> with the name of the key you want to create
project_id = "<project-id>"
location = "<location>"
key_ring_name = "<key-ring-name>"
key_name = "<key-name>"

# Create the Cloud KMS client
client = kms_v1.KeyManagementServiceClient()

# Create the key ring in the same region as the Dataflow job
parent = f"projects/{project_id}/locations/{location}"
key_ring = client.create_key_ring(
    request={"parent": parent, "key_ring_id": key_ring_name, "key_ring": {}}
)

# Create a symmetric encrypt/decrypt key in the key ring
purpose = kms_v1.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT
crypto_key = client.create_crypto_key(
    request={"parent": key_ring.name, "crypto_key_id": key_name, "crypto_key": {"purpose": purpose}}
)
```
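If the key ring or key already exists, these create calls raise AlreadyExists. A small guard, assuming the same client and variable names as above, lets the script be re-run safely:
```python
from google.api_core.exceptions import AlreadyExists

try:
    key_ring = client.create_key_ring(
        request={"parent": parent, "key_ring_id": key_ring_name, "key_ring": {}}
    )
except AlreadyExists:
    # Reuse the existing key ring instead of failing
    key_ring = client.get_key_ring(
        request={"name": client.key_ring_path(project_id, location, key_ring_name)}
    )
```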
- Dataflow does not support changing the encryption key of an existing job, so once you have created the key ring and key, drain or cancel the non-compliant job and launch a replacement from its template with the key set in the runtime environment. You can use the following code to launch the replacement job:
```python
from googleapiclient.discovery import build
import google.auth

# Replace <project-id> with your GCP project ID
# Replace <location> with the region where your job is running
# Replace <key-ring-name> with the name of the key ring you created
# Replace <key-name> with the name of the key you created
# Replace <template-path> with the Cloud Storage path of the job's template
project_id = "<project-id>"
location = "<location>"
key_ring_name = "<key-ring-name>"
key_name = "<key-name>"
template_path = "gs://<template-path>"

kms_key = (
    f"projects/{project_id}/locations/{location}"
    f"/keyRings/{key_ring_name}/cryptoKeys/{key_name}"
)

# Authenticate with GCP using Application Default Credentials
credentials, _ = google.auth.default()
service = build('dataflow', 'v1b3', credentials=credentials)

# Launch a replacement job from the template, with the CMEK set
# in the runtime environment
body = {
    "jobName": "<job-name>",
    "parameters": {},  # the template's own parameters, if any
    "environment": {"kmsKeyName": kms_key},
}
request = service.projects().locations().templates().launch(
    projectId=project_id,
    location=location,
    gcsPath=template_path,
    body=body,
)
response = request.execute()
```
- Finally, you can verify that the replacement job is using the Cloud KMS key by fetching the job and checking the serviceKmsKeyName field of its environment, which should contain projects/<project-id>/locations/<location>/keyRings/<key-ring-name>/cryptoKeys/<key-name>.
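A short sketch of that check, reusing the service object from the previous step and assuming the launch response includes the created job’s metadata:
```python
# Fetch the full job description for the job launched above
job = service.projects().locations().jobs().get(
    projectId=project_id,
    location=location,
    jobId=response['job']['id'],
    view='JOB_VIEW_ALL',
).execute()

# A CMEK-protected job records the key in its environment
print(job['environment'].get('serviceKmsKeyName'))
```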
By following these steps, you should be able to remediate the misconfiguration “Dataflow Job CMK Keys Should Be Set” for GCP using Python.
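If the job is launched from a Beam Python pipeline rather than a template, the same result can be achieved at launch time with the dataflow_kms_key pipeline option. A minimal sketch, assuming Apache Beam’s DataflowRunner and placeholder project, bucket, and key names:
```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder values: substitute your own project, region, bucket, and key
options = PipelineOptions(
    runner="DataflowRunner",
    project="<project-id>",
    region="<location>",
    temp_location="gs://<bucket>/temp",
    dataflow_kms_key=(
        "projects/<project-id>/locations/<location>"
        "/keyRings/<key-ring-name>/cryptoKeys/<key-name>"
    ),
)

# Any pipeline launched with these options stores its data
# encrypted with the customer-managed key
with beam.Pipeline(options=options) as pipeline:
    pipeline | beam.Create(["hello"]) | beam.Map(print)
```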