Load balancing using consistent hashing: A real challenge for large scale distributed web crawlers

Mitra Nasri, Mohsen Sharifi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

3 Citations (Scopus)

Abstract

Large scale search engines nowadays use distributed Web crawlers to collect Web pages, because it is impractical for a single machine to download the entire Web. Load balancing such crawlers is an important task because each crawling machine has limited memory and resources. Existing distributed crawlers use simple URL hashing based on site names as their partitioning policy; in a distributed environment this can be implemented with consistent hashing, which dynamically manages the joining and leaving of crawling nodes. This method is formally claimed to be load balanced provided that the hash function is uniform. Given that the Web structure follows a power-law distribution according to existing statistics, we argue that a uniform random hash function based on a site's URL cannot achieve load balance for large scale distributed Web crawlers. We support this claim by applying Web statistics to consistent hashing as it is used in one well-known Web crawler, and we report experimental results demonstrating the load imbalance that arises when relying solely on hashes of host names.
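The abstract's argument can be illustrated with a small simulation. The sketch below (hypothetical, not the authors' code) places crawler nodes on a consistent-hash ring with virtual nodes, assigns each host to one node by hashing its name, and draws per-site page counts from a Pareto (power-law) distribution; the node names, virtual-node count, and Pareto parameter are illustrative assumptions. Even though hosts spread almost uniformly over the ring, the heavy tail of site sizes leaves the per-node page load skewed:

```python
# Hypothetical sketch: uniform host hashing vs. power-law site sizes.
import bisect
import hashlib
import random

def h(key: str) -> int:
    """Map a key to a point on the ring via MD5 (illustrative choice)."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class ConsistentHashRing:
    def __init__(self, nodes, vnodes=100):
        # Each crawler node appears `vnodes` times on the ring to smooth
        # out the distribution of hosts across nodes.
        self.ring = sorted((h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes))
        self.keys = [k for k, _ in self.ring]

    def node_for(self, key: str) -> str:
        # First ring point clockwise from the key's hash owns the key.
        idx = bisect.bisect(self.keys, h(key)) % len(self.ring)
        return self.ring[idx][1]

random.seed(0)
nodes = [f"crawler-{i}" for i in range(8)]
ring = ConsistentHashRing(nodes)

# Power-law page counts per site: a few huge sites dominate the total.
hosts = [f"site{i}.example" for i in range(10_000)]
pages = {host: int(random.paretovariate(1.1)) + 1 for host in hosts}

load = {n: 0 for n in nodes}
for host, count in pages.items():
    load[ring.node_for(host)] += count  # an entire site lands on one node

# Ratio of heaviest to lightest node; well above 1 under a heavy tail.
print(max(load.values()) / min(load.values()))
```

Because partitioning is by host name, a single huge site cannot be split across nodes, so whichever node draws it carries a disproportionate load regardless of how uniform the hash is.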

Original language: English
Title of host publication: International Conference on Advanced Information Networking and Applications Workshops, WAINA 2009
Pages: 715-720
Number of pages: 6
DOIs
Publication status: Published - 2009
Externally published: Yes
Event: 2009 International Conference on Advanced Information Networking and Applications Workshops, WAINA 2009 - Bradford, United Kingdom
Duration: 26 May 2009 - 29 May 2009

Publication series

Name: Proceedings - International Conference on Advanced Information Networking and Applications, AINA
ISSN (Print): 1550-445X

Conference

Conference: 2009 International Conference on Advanced Information Networking and Applications Workshops, WAINA 2009
Country/Territory: United Kingdom
City: Bradford
Period: 26/05/09 - 29/05/09
