Insecure indexing risk dissected

How did THAT get out?

It's embarrassing when future PR items, upcoming security advisories or boilerplate obituaries that were never meant to be visible to external users drift into the public domain. Such documents might be accidentally uploaded to the wrong part of a website, but deliberate attacks can also play a role.

Web application security researcher Amit Klein this week published a paper explaining how "insecure indexing" allows attackers to expose hidden files on web servers. Some site-installed search engines index files that external search engines are programmed to ignore. Typically, a crawler looks in a site's root directory for a special file called robots.txt, which tells the robot (spider) which files it may download.
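For readers unfamiliar with the mechanism: robots.txt is a plain-text file served from the site root. A minimal, illustrative example (the paths here are invented, not taken from any real site or from Klein's paper) might look like this:

```
User-agent: *
Disallow: /internal/
Disallow: /drafts/
```

Crucially, robots.txt is only advisory. A site-installed search engine that indexes files straight from the server's filesystem can ignore it entirely, which is precisely the gap the paper describes.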

An attacker who can query a site's internal search engine can therefore get at files that the Robots Exclusion Standard denies to outside crawlers. Klein explains that these attacks are "fundamentally different from exploiting external (remote) search engines".

Klein details various attack techniques, from guessing a hidden file's name based on names that already exist, through targeted search strings, to far more complicated, traffic-intensive attacks, and concludes with methods for detecting insecure indexing and suggested defences. "Crawling style indexing should be preferred over direct file indexing. If file-level indexing cannot be avoided, more consideration should be made when deploying a search engine that facilitates it. In particular those search engines should be systematically limited to the visible resources (or at the very least, to accessible resources)," he writes.
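To make the first of those techniques concrete, here is a minimal, hypothetical sketch of how an attacker might derive candidate hidden-file names from a file that is already visible, before submitting each guess to the site's internal search engine. The filenames and suffix list are invented for illustration and are not taken from Klein's paper:

```python
def candidate_names(visible_name):
    """Guess likely hidden variants of a visible filename.

    Purely illustrative: the suffix list below is an assumption,
    not a list from Klein's paper.
    """
    stem, dot, ext = visible_name.rpartition(".")
    guesses = []
    # Editors and backup tools commonly leave copies like these behind.
    for suffix in (".bak", ".old", ".orig", "~"):
        guesses.append(visible_name + suffix)
    if dot:
        # Draft or internal variants of the same document.
        guesses.append(f"{stem}-draft.{ext}")
        guesses.append(f"{stem}-internal.{ext}")
    return guesses

# Each guess would then be fed as a query to the internal search
# engine; a hit reveals a file that robots.txt never exposed.
print(candidate_names("advisory-2005.html"))
```

The point of the sketch is that the internal search engine acts as the oracle: the attacker never needs directory listings, only a search box that was allowed to index more than the visible site.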

The paper - Insecure Indexing Vulnerability: Attacks Against Local Search Engines - can be found on the Web Application Security Consortium's site here. ®

Related stories

Botnets strangle Google Adwords campaigns
Phishers suspected of eBay Germany domain hijack
Interview with a link spammer
Google's No-Google tag blesses the Balkanized web
Google exposes web surveillance cams
Major flaw found in Google Desktop
