GCP Threats
Hadoop HDFS Port Should Not Be Open
More Info: Determines if TCP ports 50070 and 50470 for the Hadoop/HDFS NameNode WebUI service are open to the public.
Risk Level: Medium
Address: Security
Compliance Standards: CBP
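Before remediating, you can list which firewall rules currently expose these ports. The snippet below is a minimal sketch using the google-cloud-compute client library; “my-project” is a placeholder project ID, and the sketch does not expand port ranges (such as “50000-51000”), so treat it as a starting point rather than a complete audit.
from google.cloud import compute_v1

HDFS_PORTS = {"50070", "50470"}

def rules_exposing_hdfs(project_id):
    # Iterate over all VPC firewall rules in the project.
    client = compute_v1.FirewallsClient()
    for rule in client.list(project=project_id):
        if rule.direction != "INGRESS" or rule.disabled:
            continue
        if "0.0.0.0/0" not in rule.source_ranges:
            continue
        for allowed in rule.allowed:
            # An empty ports list means the rule allows every port for that protocol.
            if allowed.I_p_protocol in ("tcp", "all") and (
                not allowed.ports or HDFS_PORTS & set(allowed.ports)
            ):
                yield rule.name

for name in rules_exposing_hdfs("my-project"):  # placeholder project ID
    print("Rule may expose the HDFS NameNode WebUI:", name)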
Triage and Remediation
Remediation
To remediate the misconfiguration “Hadoop HDFS Port Should Not Be Open” for GCP using the GCP console, follow these steps:
- Log in to your GCP console (https://console.cloud.google.com/).
- Select the project where the Hadoop HDFS port is open.
- Click on the “Navigation menu” in the top left corner and select “Compute Engine”.
- In the Compute Engine dashboard, click on “VM instances” to see all the instances.
- Select the instance where the Hadoop HDFS port is open.
- Click on the “Edit” button at the top of the page.
- Scroll down and expand the “Management, security, disks, networking, sole tenancy” section, then open the “Networking” tab.
- In the “Firewall rules” section, click on the “Edit” button next to the firewall rule that allows the Hadoop HDFS port.
- In the “Edit firewall rule” dialog box, change the “Action” to “Deny”.
- Click on the “Save” button to save the changes.
- Verify that the Hadoop HDFS port is no longer open by running a port scan against the instance (a sample check is shown after these steps).
By following these steps, you have successfully remediated the misconfiguration “Hadoop HDFS Port Should Not Be Open” for GCP using GCP console.
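For the final verification step, a simple reachability check can stand in for a full port scan. The snippet below is a minimal Python sketch; the IP address is a placeholder for the external address of the remediated instance.
import socket

INSTANCE_IP = "203.0.113.10"  # placeholder: external IP of the remediated instance

for port in (50070, 50470):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(3)
        result = s.connect_ex((INSTANCE_IP, port))
    status = "OPEN" if result == 0 else "closed/filtered"
    print(f"{INSTANCE_IP}:{port} -> {status}")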
To remediate the misconfiguration “Hadoop HDFS Port Should Not Be Open” for GCP using the GCP CLI, follow these steps:
- Open the Cloud Shell from the GCP console.
- Run the following command to list all the instances in your project:
gcloud compute instances list
- Identify the instance that has the Hadoop HDFS port open.
- Run the following command to SSH into the instance:
gcloud compute ssh [INSTANCE_NAME]
Replace [INSTANCE_NAME] with the name of the instance.
- Once you are logged in to the instance, run the following command to stop the Hadoop HDFS service:
sudo systemctl stop hadoop-hdfs-datanode
- Run the following command to disable the Hadoop HDFS service:
sudo systemctl disable hadoop-hdfs-datanode
- Edit the Hadoop HDFS configuration file to remove the port configuration. The configuration file is usually located at /etc/hadoop/conf/hdfs-site.xml (a snippet for inspecting the current bind addresses follows these steps).
- Save the changes and exit the editor.
- Run the following command to start the Hadoop HDFS service:
sudo systemctl start hadoop-hdfs-datanode
- Finally, run the following command to enable the Hadoop HDFS service to start automatically on boot:
sudo systemctl enable hadoop-hdfs-datanode
- Exit the SSH session by running the following command:
exit
With these steps, you have successfully remediated the misconfiguration “Hadoop HDFS Port Should Not Be Open” for GCP using the GCP CLI.
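For the configuration-edit step, it can help to confirm which addresses the NameNode WebUI is currently bound to before changing anything. The snippet below is a sketch that assumes the configuration path shown above and the usual Hadoop 2.x property names (dfs.namenode.http-address and dfs.namenode.https-address); both are assumptions to verify against your distribution.
import xml.etree.ElementTree as ET

CONF_PATH = "/etc/hadoop/conf/hdfs-site.xml"  # path from the steps above; may differ per distribution
# Typical NameNode WebUI bind-address property names (an assumption; verify for your Hadoop version).
WEBUI_PROPS = {"dfs.namenode.http-address", "dfs.namenode.https-address", "dfs.http.address"}

root = ET.parse(CONF_PATH).getroot()
for prop in root.findall("property"):
    name = prop.findtext("name", default="")
    if name in WEBUI_PROPS:
        value = prop.findtext("value", default="")
        print(f"{name} = {value}")
        if value.startswith("0.0.0.0"):
            print("  -> bound to all interfaces; bind to an internal address or close the port")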
To remediate the Hadoop HDFS port being open in GCP using Python, first create a firewall rule that blocks the port; the console flow below outlines the rule, and a Python sketch that creates the same rule follows these steps:
- Open the Google Cloud Console and navigate to the project that has the misconfiguration.
- Select the Compute Engine service from the left-hand menu.
- Select the VM instance that has the Hadoop HDFS port open.
- Click on the Edit button at the top of the page.
- Scroll down to the Firewall section and click on it.
- Click on the Add Firewall Rule button.
- In the Name field, enter a name for the firewall rule (e.g., “block-hadoop-hdfs-port”).
- In the Targets field, choose how the rule is applied, for example “Specified target tags”, and enter a network tag attached to the VM instance that has the Hadoop HDFS port open.
- In the Source IP ranges field, enter the IP ranges the rule should apply to. For example, enter “0.0.0.0/0” to cover all source addresses.
- Set “Action on match” to “Deny” so that matching traffic is blocked.
- In the Protocols and ports field, select “Specified protocols and ports” and enter the Hadoop HDFS ports flagged by this check (50070 and 50470 for the NameNode WebUI; some deployments also expose 9000).
- Click on the Create button to create the firewall rule.
Once the firewall rule is created, it will block all incoming traffic to the Hadoop HDFS port from the specified IP addresses or ranges. This will remediate the misconfiguration and prevent unauthorized access to the Hadoop HDFS port.
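The same firewall rule can be created programmatically. The snippet below is a minimal sketch using the google-cloud-compute client library; the project ID, network, and target tag are placeholders, and the ports match the ones flagged by this check (50070 and 50470).
from google.cloud import compute_v1

PROJECT_ID = "my-project"             # placeholder project ID
NETWORK = "global/networks/default"   # placeholder VPC network
TARGET_TAG = "hadoop-hdfs"            # placeholder network tag attached to the affected instance

# Deny ingress to the HDFS NameNode WebUI ports from any source.
denied = compute_v1.Denied()
denied.I_p_protocol = "tcp"
denied.ports = ["50070", "50470"]

firewall = compute_v1.Firewall()
firewall.name = "block-hadoop-hdfs-port"
firewall.direction = "INGRESS"
firewall.network = NETWORK
firewall.priority = 100               # low number = high priority, so it wins over broader allow rules
firewall.source_ranges = ["0.0.0.0/0"]
firewall.target_tags = [TARGET_TAG]
firewall.denied = [denied]

client = compute_v1.FirewallsClient()
operation = client.insert(project=PROJECT_ID, firewall_resource=firewall)
operation.result()                    # wait for the operation to finish
print("Created firewall rule:", firewall.name)
Run the sketch with credentials that have the compute.firewalls.create permission on the project; the low priority number ensures the deny rule takes precedence over broader allow rules.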