Internet could be 500 times bigger than we think

And very easy to kick over

The true extent of the Internet is not widely known and, according to a study published this week by search company BrightPlanet, it could be more than 500 times larger than we think. The authors claim that as much as 7,500TB of data sits in places on the Web that no search engine has mapped, compared with the 19TB on the familiar "surface" Web.

The material BrightPlanet has uncovered consists mostly of public information - 95 per cent of it is freely accessible, and more than half of it resides in topic-specific databases. The 60 largest of these databases alone contain 750TB of information, roughly 40 times the 19TB of the entire "surface" Web.

All this hidden material is causing a great deal of frustration, the company says, because people can't find it with the usual search engines. Even NorthernLight.com, widely reported to have mapped the largest share of the Web at 16 per cent, covers only 0.03 per cent of the total once the "deep" Web is counted.

Fortunately, and perhaps unsurprisingly, BrightPlanet has the solution in the form of its own new software, Lexibot. It searches the surface Web in the usual way, and also queries the online databases directly to pull in results from the deep Web.
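BrightPlanet has not published how Lexibot works internally, but the behaviour it describes - sending one query both to the surface Web and to a set of topic-specific databases, then merging the answers - is classic federated search. The sketch below is purely illustrative: the search_surface_web and query_database functions and the database list are hypothetical stand-ins, not Lexibot's API.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stubs: a real tool would call a search-engine API here and
# the query interface of each topic-specific database.
def search_surface_web(query):
    return [{"source": "surface web", "title": f"page about {query}"}]

def query_database(db_name, query):
    return [{"source": db_name, "title": f"{db_name} record on {query}"}]

DEEP_WEB_DATABASES = ["patents", "court-records", "journal-abstracts"]  # made-up names

def federated_search(query):
    """Fan the query out to the surface Web and every database in parallel,
    then merge the result lists. Running the slow deep-Web queries
    concurrently rather than one after another keeps the total wait down."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(search_surface_web, query)]
        futures += [pool.submit(query_database, db, query) for db in DEEP_WEB_DATABASES]
        return [hit for f in futures for hit in f.result()]

if __name__ == "__main__":
    for hit in federated_search("deep web size"):
        print(hit["source"], "->", hit["title"])
```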

However, this does not happen fast. In fact, shifting continents can just about keep up: company executives estimate an average of 10 to 25 minutes to fulfil a search request, and complex queries could take anything up to an hour and a half. And uniquely, as far as we know, the software also costs money - $89.95, following a free 30-day trial.

In the same week, New Scientist reports that the Internet is not just bigger than we think, but also more vulnerable to sabotage than we imagine.

According to a mathematical model published this week, if the right nodes were targeted, the network would quickly break down into isolated pieces and stop working. However, the Internet could withstand 18 per cent of its nodes being taken down randomly.

The study compared an exponential network, in which every node has roughly the same number of connections, with a scale-free network like the Internet, where a handful of highly connected hubs carry most of the links. Under random failure the exponential network loses performance quickly, because every node matters about equally, while the scale-free network barely notices: a randomly chosen casualty is almost always a minor node. Under a targeted attack the positions reverse. Knocking out the hubs quickly fragments the scale-free network, whereas the exponential network has no obvious weak point to shoot for and handles such an assault far better.
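The effect is easy to reproduce in simulation. The sketch below is not the researchers' actual model, just an illustration of the same idea: it builds a scale-free (Barabasi-Albert) graph and a random graph of the same size as a stand-in for the "exponential" case, removes 18 per cent of the nodes either at random or in descending order of degree, and reports how much of the network remains in one connected piece.

```python
import random
import networkx as nx

def attack(graph, fraction, targeted):
    """Remove `fraction` of the nodes and return the share of the original
    network left in the largest connected component."""
    g = graph.copy()
    n_remove = int(fraction * g.number_of_nodes())
    if targeted:
        # Targeted attack: take out the best-connected nodes (the hubs) first.
        victims = sorted(g.nodes, key=g.degree, reverse=True)[:n_remove]
    else:
        # Random failure: every node is equally likely to go down.
        victims = random.sample(list(g.nodes), n_remove)
    g.remove_nodes_from(victims)
    largest = max((len(c) for c in nx.connected_components(g)), default=0)
    return largest / graph.number_of_nodes()

n = 10_000
scale_free = nx.barabasi_albert_graph(n, 2)  # a few hubs, many minor nodes
exponential = nx.gnm_random_graph(n, scale_free.number_of_edges())  # same size, no hubs

for name, g in [("scale-free", scale_free), ("exponential", exponential)]:
    for targeted in (False, True):
        mode = "targeted" if targeted else "random"
        frac = attack(g, 0.18, targeted)
        print(f"{name:11s} {mode:8s} removal: {frac:.2f} of nodes still connected")
```

Run like this, the scale-free graph should shrug off the random failures but fragment badly when its hubs are targeted, while the random graph behaves much the same either way - the pattern the study describes.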

Researchers say this means network operators can focus their security resources on the really vital parts of the network, such as the backbone systems that carry most of the traffic. ®

Related links

The search for the perfect search engine
