Crackers use search engines to exploit weak sites

People post some incredibly dumb things on the Web

The recent proliferation of point-and-drool GUI utilities for brute-force password cracking has led many crackers and Script Kiddies to overlook a powerful and quite obvious tool available to all, the common search engine.

With a bit of ingenuity, anyone can skirt basic password authentication and go straight to the goodies on those sites where administrators are foolish enough to post them. If the desired information is contained in a Web page, anyone can find it.

One security enthusiast warmly acquainted with The Register, known as 'Utreg', has found this a convenient shortcut.

"HotBot advanced search allows you to specify your search with file extensions, looking for sites or directories that include .dat files and the words 'index of' and 'admin' or 'customer'," Utreg says.

He showed us a file named data.txt on ISP Lanline.com's servers which contained the personal information of several hundred people, including their names, addresses, social security numbers and credit card account details - and all of it in plain text.

We rang Lanline to get to the bottom of it. They discovered that the information belonged to a commercial site which they had once hosted. When the Web site owners moved or packed up, they carelessly left their Web pages, including several highly confidential ones, behind on Lanline's server.

The information had originally been generated by some sort of shopping-cart application, a Lanline spokesperson told The Register. The data has since been removed.

Ease of use

By clicking the 'advanced search' button on the HotBot main page, one is offered a number of intriguing options. No need to be a whiz with Boolean operators; a nice CGI menu is provided. Enter the words 'admin' and 'user' and tick the 'file types' box for the extension .dat. It works nicely.
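Under the hood, a CGI menu like HotBot's simply turns the form fields into a query string. A minimal sketch of that step, assuming hypothetical parameter names ('words' and 'ext' are illustrative; HotBot's real CGI field names are not documented here):

```python
from urllib.parse import urlencode

# Hypothetical form fields -- the actual HotBot parameter names may differ.
params = {
    "words": "admin user",  # the search terms typed into the box
    "ext": "dat",           # the 'file types' restriction, .dat files
}

query = urlencode(params)
print(query)  # words=admin+user&ext=dat
```

The point is how little machinery is involved: two form fields and an ampersand, and the engine does the rest.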

"That's the scary thing... it's just so bloody simple, any fourteen-year-old can do it," Utreg told The Register. "The possibilities look unlimited; the only restriction is your own creativity."

HotBot parent Lycos told The Register that it has no intention of modifying the search engine's capabilities to block sensitive file types.

"We're concerned that people are putting sensitive data on their Web sites," Lycos told us. But file-type searching is a useful feature; and it is ultimately the obligation of operators to secure their data by not maintaining it on a public Web site, the company notes.

Get an education

For those who don't fully grasp the potential of search engines, and are at a loss to guess which files and directories they might wish to search for, many Web sites are conveniently set up with a useful file that will help one get started.

Our friend 'fravia+' recommends searching for this file, called robots.txt, in the main directory of a target site, by entering a URL with the following pattern: http://www.targetsite.com/robots.txt. The robots.txt file is used to tell search engines which directories and files they should not index.

Nothing listed in a 'robots.txt' file will turn up in a search query; but once a person has seen the directory and file names it contains, they can type them directly into their browser to access the various subdirectories and pages which the site administrators would rather keep hidden. These are of course the very subdirectories and files most likely to be of interest to crackers.
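The trick above needs nothing more than reading the Disallow lines out of the file. A minimal sketch, using a hypothetical robots.txt as input (in practice one would fetch the file from the target site with any HTTP client):

```python
# Hypothetical robots.txt contents, of the sort fravia+ suggests looking for.
sample = """\
User-agent: *
Disallow: /admin/
Disallow: /customer/
Disallow: /data/orders.dat
"""

def disallowed_paths(robots_txt):
    """Return the paths a robots.txt file asks crawlers to skip."""
    paths = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "disallow" and value.strip():
            paths.append(value.strip())
    return paths

print(disallowed_paths(sample))
# ['/admin/', '/customer/', '/data/orders.dat']
```

Each returned path can then be appended to the site's base URL and typed straight into a browser, which is exactly the exposure described above: robots.txt keeps files out of the index, not out of reach.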

The fravia+ Web site contains an extensive treasury of educational material for those who wish to extract the maximum performance from search engines.

For Web site operators afraid of falling prey to such backdoor inquiries, the solution is painfully obvious and quite simple. Stop putting sensitive data in public places. A file which you would not print out and post on a billboard simply has no business being posted on a Web site. ®
