
Pimplier Batgirls and Sawdusty Barmen

Whacking Google's wordlist woes


Earlier this week the NCSA released a study that attempted to compare the respective merits of Google and Yahoo!'s search engines. (See My spam-filled search index is bigger than yours!). Unfortunately, the only thing it proved was which search engine was publishing the most gibberish it had collected - a fact apparently lost on the researchers. The three academics insisted that because Google was returning more gibberish, it must be doing a better job.

Doh!

The phenomenon, we discover, is relatively recent, and it's an unintended consequence of both search engines trying to make their searches more comprehensive. The trouble is that Google is returning pages which are nothing but great long lists of words as valid search results, when rarely, if ever, is this what the searcher is looking for. Unless you have a thing for strange combinations of words.

But strangely enough, some people do have a thing for strange combinations of words - and this week, it's these very hobbyists who have been able to shed more light on the search giants' internal operations than the academics. Their pursuit is Google Whacking - the art of finding two words that produce just a single result from the search engine - which comedy writer Dave Gorman turned into a book and a stage show. You can see a list of the most recently discovered Google Whacks here. Current Whacks at the time of writing include "rhubarb underkill", "oxymoronically flakier", "overpaid brainworkers" - somewhat surprisingly - and, to our relief, there's only one web page in the world with the words "subhuman stepsiblings" close enough together to merit a hit.

Don't expect these Whacks to work for long - as soon as they're repeated on a public forum, and Google finds them, they cease to be Whacks. Today's most delightful Whacks, like "pimplier batgirls" and "sawdusty barmen", are sure to be gone tomorrow.
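The defining test is simple enough to write down: a two-word query is a Whack only if the engine reports exactly one hit. A minimal sketch of that predicate follows - the hit counts here are stubbed in a dictionary purely for illustration (they are not real figures), since actually querying Google is beside the point.

```python
def is_whack(word_a: str, word_b: str, hit_counts: dict) -> bool:
    """Return True if the two-word query yields exactly one result."""
    return hit_counts.get((word_a, word_b), 0) == 1

# Stubbed, hypothetical hit counts - a real check would query the engine.
counts = {
    ("pimplier", "batgirls"): 1,        # a Whack - until it's published
    ("rhubarb", "underkill"): 1,
    ("cloud", "backup"): 2_400_000,     # far too common to ever whack
}
```

Which also shows why Whacks are so perishable: the moment a second page repeats the pair, the count ticks to two and the predicate fails.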

Search guru Gary Stock, CTO of Nexcerpt, who coined the phrase, says things have been very strange indeed in the world of Google and Yahoo! recently.

What the Whackers have highlighted is how the major search engines are struggling to cope with word lists - like this one - with Yahoo! doing a rather better job than Google. It's a consequence, he says, of both search engines removing the cap from the length of documents that they index.

You may have marvelled at how so many web pages appeared to be precisely 101KB long. Well, for some years Google only indexed the first 101KB of a web page, and that's all you could see. A year ago Yahoo! lifted the cap, and more recently Google followed suit. The result has been to open the door to more gibberish.

"The contribution was distinct and dramatic," says Stock. "Googlewhackers were commenting on the change well before it was noted in most search forums. We're exposed to a lot of raw, peculiar stuff around the edges."

Google was already pretty good at recognizing duplicates, he points out. But not quite good enough. Surely this isn't rocket science to fix, we wondered?

"Google has the advantage of knowing precisely what is in their corpus, and being able to call upon vast research and statistical data about what is - and what is not - legitimate text. Wordlists look exactly like … lists of words!

"Here's a clue for low hanging fruit: legitimate text won't include a sequence of 1,000 words in alphabetical order - with no punctuation!"

"If Google wants to exclude other machine-generated texts (search 'hipcrime' or 'sporgeries' for many scary instances), that'll require a bit more effort. Again though, they have plenty of data to create useful profiles, and plenty of CPU to apply them."

"If Googlewhack's code can identify wordlists with a high degree of accuracy, then Google could readily make it part of The Machine."
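Stock's "low hanging fruit" is easy to sketch: flag any page whose text contains a very long run of words in alphabetical order with no punctuation. The snippet below is an illustrative take on that rule of thumb - not Google's (or Googlewhack's) actual code - with the 1,000-word threshold taken from his quote.

```python
import string

def longest_alphabetical_run(text: str) -> int:
    """Length of the longest run of consecutive words in alphabetical order."""
    words = [w.lower() for w in text.split()]
    best = run = 1 if words else 0
    for prev, cur in zip(words, words[1:]):
        run = run + 1 if prev <= cur else 1
        best = max(best, run)
    return best

def looks_like_wordlist(text: str, threshold: int = 1000) -> bool:
    """Stock's heuristic: a long alphabetical run, and no punctuation at all."""
    no_punct = not any(ch in string.punctuation for ch in text)
    return no_punct and longest_alphabetical_run(text) >= threshold

# A dictionary dump trips the detector; ordinary prose does not.
wordlist = " ".join(sorted(f"word{i:04d}" for i in range(1200)))
prose = "The quick brown fox, as ever, jumps over the lazy dog."
```

As Stock notes, this catches only the laziest wordlists; machine-generated sporgery that mimics real prose would need the statistical profiling he describes.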

That's a point made by reader Jeremy Pickens, although it may sound counter-intuitive at first.

"Just because a page never gets returned in the top 1,000 doesn't mean there is no use in having that page in your index," he writes. "For example, if Yahoo! were to do some sort of statistical or structural analysis of those additional 10 billion spam pages, they might be able to use all that information to better detect/classify search spam in the future."

"So, kudos to Yahoo! for actually taking the disk space to store all that extra information, instead of just throwing it away."

Reader David notes, "The bottom line seems to be that Google is better at including obscure pages or Yahoo is better at rejecting useless pages and the NCSA excels at keeping incompetent PhDs funded."®

Bootnote: Martin Torzewski earns himself the last word with this thought. "Your reference to Borges prompted the thought that his story 'The Library of Babel' might be relevant here!" Too true, alas.


