Original URL: https://www.theregister.com/2000/07/27/internet_could_be_500_times/

Internet could be 500 times bigger than we think

And very easy to kick over

By Lucy Sherriff

Posted in Legal, 27th July 2000 14:51 GMT

The true extent of the Internet is not widely known, and according to a study published this week, it could be more than 500 times larger than we think. The authors claim that as much as 7,500TB of data exists in corners of the Web that no search engine has mapped, compared with the 19TB on the familiar "surface" Web.

The material uncovered by BrightPlanet, the company behind the study, consists mostly of public information: 95 per cent of it is freely accessible, and more than half of it sits in topic-specific databases. The 60 largest of these databases alone hold 750TB of information, roughly 40 times the 19TB estimated for the entire "surface" Web.

All this hidden material is causing a great deal of frustration, the company says, because people can't find it with the usual search engines. Even NorthernLight.com, widely reported to have mapped a larger share of the Web than any other engine at 16 per cent, covers only 0.03 per cent of total content once the "deep" Web is included.

Fortunately, and perhaps unsurprisingly, BrightPlanet has the solution in the form of its own new software, called Lexibot. As well as searching the surface Web, it queries the online databases directly to dig out the information buried there.

However, this does not happen fast. In fact, shifting continents can just about keep up. Company executives estimate that an average search request takes 10 to 25 minutes to complete, while complex queries could take anything up to an hour and a half. And uniquely, as far as we know, the software also costs money: $89.95 after a free 30-day trial.

In the same week, New Scientist reports that the Internet is not just bigger than we think, but also more vulnerable to sabotage than we imagine.

According to a mathematical model published this week, if the right nodes were targeted, the network would quickly break down into isolated pieces and stop working. However, the Internet could withstand 18 per cent of its nodes being taken down randomly.

The study compared an exponential network, in which every node has roughly the same number of connections, with a scale-free network like the Internet, in which a handful of highly connected hubs carry most of the links. Under random failures the exponential network loses performance quickly, since every node matters equally, while the scale-free network barely notices: a randomly chosen node is almost always a minor one. But under a targeted attack the positions reverse. The exponential network offers no obvious weak point to shoot for and weathers the assault far better, whereas picking off the scale-free network's hubs rapidly tears it apart.
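The effect is easy to reproduce in miniature. The sketch below is a rough illustration of the idea rather than the researchers' actual model: it builds a scale-free graph and a similarly sized random graph, removes 18 per cent of their nodes either at random or highest-degree first (the fraction mirrors the figure above but is otherwise arbitrary), and reports how much of each network still hangs together. It assumes the Python networkx library; the graph sizes are our own illustrative choices.

# Rough illustration (not the researchers' actual model): remove 18 per cent
# of the nodes from a scale-free graph and from a random "exponential" graph,
# either at random or highest-degree first, and see how much of each network
# still hangs together. Assumes the networkx library is installed.
import random
import networkx as nx

def surviving_fraction(graph: nx.Graph, original_size: int) -> float:
    """Fraction of the original nodes left in the largest connected piece."""
    if graph.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(graph)) / original_size

def knock_out(graph: nx.Graph, fraction: float, targeted: bool) -> float:
    """Remove a fraction of the nodes, randomly or hubs-first, and report
    how much of the network remains in one connected piece."""
    g = graph.copy()
    n = g.number_of_nodes()
    count = int(fraction * n)
    if targeted:
        by_degree = sorted(g.degree, key=lambda pair: pair[1], reverse=True)
        victims = [node for node, _ in by_degree[:count]]
    else:
        victims = random.sample(list(g.nodes), count)
    g.remove_nodes_from(victims)
    return surviving_fraction(g, n)

if __name__ == "__main__":
    n = 10_000
    scale_free = nx.barabasi_albert_graph(n, 2)   # a few huge hubs, many small nodes
    exponential = nx.gnm_random_graph(n, scale_free.number_of_edges())  # same size, no hubs

    for name, g in (("scale-free", scale_free), ("exponential", exponential)):
        for targeted in (False, True):
            mode = "targeted" if targeted else "random"
            left = knock_out(g, 0.18, targeted)
            print(f"{name:12s} {mode:8s} removal of 18%: "
                  f"{left:.0%} of nodes still connected")

Run it a few times and the asymmetry the study describes should show up: hubs-first removal does far more damage to the scale-free graph than random removal does, while the exponential graph is much less sensitive to which nodes it loses.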

Researchers say this means network operators can focus their security resources on the really vital parts of the network, such as the backbone systems that carry most of the traffic.®

Related links

The search for the perfect search engine