One trillion unique Web URLs are out there
- 27 July, 2008 22:00
In a discovery that would probably send the Dr. Evil character of the "Austin Powers" movies into cardiac arrest, Google recently detected more than a trillion unique URLs on the Web.
This milestone awed Google search engineers, who are seeing the Web grow by several billion individual pages every day, company officials wrote in a blog post.
In addition to announcing this finding, Google took the opportunity to promote the scope and magnitude of its index.
"We don't index every one of those trillion pages -- many of them are similar to each other, or represent auto-generated content ... that isn't very useful to searchers. But we're proud to have the most comprehensive index of any search engine, and our goal always has been to index all the world's data," wrote Jesse Alpert and Nissan Hajaj, software engineers in Google's Web Search Infrastructure Team.
It had been a while since Google had made public pronouncements about the size of its index, a topic that routinely generated controversy and counterclaims among the major search engine players years ago.
Those days of index-size envy ended when it became clear that most people rarely scan more than two pages of Web results. In other words, what matters is delivering 10 or 20 really relevant Web links, or, even better, a direct factual answer, because few people will wade through 5,000 results to find the desired information.
It will be interesting to see whether this announcement from Google, posted on its main official blog, triggers a round of reactions from rivals like Yahoo, Microsoft and Ask.com.
In the meantime, Google also disclosed interesting information about how and with what frequency it analyzes these links.
"Today, Google downloads the web continuously, collecting updated page information and re-processing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections. So multiple times every day, we do the computational equivalent of fully exploring every intersection of every road in the United States. Except it'd be a map about 50,000 times as big as the U.S., with 50,000 times as many roads and intersections," the officials wrote.
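The "intersections and roads" analogy maps neatly onto how a link graph is commonly stored and traversed: pages are nodes, hyperlinks are directed edges, and "fully exploring every intersection" amounts to visiting every node. A minimal sketch, using hypothetical URLs and a standard breadth-first traversal (not Google's actual pipeline):

```python
from collections import deque

# A miniature web-link graph as an adjacency list: each URL maps to
# the URLs it links to. The page names are invented for illustration.
link_graph = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example", "d.example"],
    "d.example": [],
}

def explore_all(graph):
    """Visit every node in the graph via breadth-first search --
    the small-scale analogue of exploring every intersection."""
    visited = set()
    for start in graph:  # start from each node so disconnected parts are covered
        if start in visited:
            continue
        visited.add(start)
        queue = deque([start])
        while queue:
            url = queue.popleft()
            for target in graph.get(url, []):
                if target not in visited:
                    visited.add(target)
                    queue.append(target)
    return visited

print(sorted(explore_all(link_graph)))
```

At Google's stated scale, the same idea runs over a trillion nodes, distributed across many machines, several times per day.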