Clusters f**ked: Insecure Hadoop file systems wiped by miscreants

Weak default settings attract data deletion attacks despite warnings

Administrators of Hadoop Distributed File System (HDFS) clusters have evidently not heeded warnings that surfaced last month about securing software with insecure default settings.

Attacks on Hadoop clusters have wiped the data of at least 165 installations, according to GDI Foundation security researchers Victor Gevers, Niall Merrigan, and Matt Bromiley. The trio report that 5,300 Hadoop clusters are presently exposed to the internet, some of which may be vulnerable.

"The default installation for HDFS Admin binds to the IP address 0.0.0.0 and allows any unauthenticated user to perform super user functions to a Hadoop cluster," the group's report states. "These functions can be performed via a web browser, and do not prevent an attacker from destructive actions. This may include destroying data nodes, data volumes, or snapshots with terabytes of data in seconds."
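Concretely, the destructive super-user functions are reachable over HDFS's WebHDFS REST API on the same port as the web UI. A hypothetical sketch of what such a request could look like (the hostname is a placeholder; `op=DELETE`, `recursive`, and `user.name` are standard WebHDFS parameters):

```shell
# Hypothetical illustration -- victim-cluster.example.com is a placeholder.
# Under "simple" auth (the insecure default), WebHDFS trusts whatever
# user.name the client supplies, so one unauthenticated HTTP request
# can recursively delete the filesystem root:
curl -i -X DELETE \
  "http://victim-cluster.example.com:50070/webhdfs/v1/?op=DELETE&recursive=true&user.name=hdfs"
```

No exploit is required; the request is simply using the API as configured.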

A previous round of attacks hit Hadoop clusters via port 50070 last month, as The Register noted.

At the time, one attacker was spotted erasing file directories and adding a new directory titled /NODATA4U_SECUREYOURSHIT.

Those conducting the current round of attacks have also deleted data while placing a directory titled /PLEASE_README.

This would be an obvious place to put a ransom note promising to restore the deleted data in exchange for payment. Gevers told El Reg he couldn't find a ransom note, so this attack looks like vandalism. "Or this last attack only created an empty directory and they forgot to place the note," he added. "We have seen these misfire attacks before on MongoDB."

The researchers suggest that while attackers may demand ransom payments, they don't have anything to offer in return. "Victims who have paid ransom prices have not received data in return, and are often left without a means to recover," the report states.

The Hadoop attacks echo those that have affected MongoDB and Elasticsearch instances.

The researchers predict it will not be long before HDFS is subject to more intensive ransomware attacks.

They advise turning on Hadoop Secure Datanode, Safemode, and service level authentication (via Kerberos). They also recommend blocking port 50070 from untrusted IPs, adding IAM control and network segmentation via some form of OpenVPN, and implementing a reverse proxy, such as Knox, to defend against unauthorized access.
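A minimal sketch of the authentication side of that advice, assuming a stock Hadoop 2.x cluster (the property names below are standard Hadoop configuration keys, but exact values and the accompanying Kerberos keytab setup depend on your deployment):

```xml
<!-- core-site.xml: replace "simple" (trust-the-client) authentication
     with Kerberos, and enable service-level authorization -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

<!-- hdfs-site.xml: enforce HDFS permission checks -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>true</value>
</property>
```

With `hadoop.security.authentication` left at its default of `simple`, any client-supplied username is trusted, which is exactly the behavior the attacks abuse.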

Or you could just leave the door open and hope no one walks away with your data. ®
