Analysing Web graphs is hampered by the need to store a major part of such huge graphs in external memory, which prevents efficient random access to edge (hyperlink) lists. A number of compression-based algorithms have therefore been proposed to represent Web graphs succinctly while still providing random access. Our algorithm belongs to this category. It works on contiguous blocks of adjacency lists, and its key mechanism is merging each block into a single ordered list. This method achieves compression ratios much better than most methods known from the literature, at rather competitive access times. © 2011 Springer-Verlag Berlin Heidelberg.
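The core idea can be illustrated with a minimal sketch. This is not the authors' exact encoding, only an assumed toy version of the merging step: a block of adjacency lists is merged into one sorted list of distinct neighbours, and a per-list bit vector records which neighbours belong to each original list, so every list remains individually recoverable.

```python
# Illustrative sketch (hypothetical, not the paper's exact scheme):
# merge a block of adjacency lists into one sorted list of distinct
# neighbours, plus one bit per (list, neighbour) pair for reconstruction.

def merge_block(adj_lists):
    """Merge a block of adjacency lists into (merged, flags)."""
    # Single ordered list of all distinct neighbours in the block.
    merged = sorted(set().union(*map(set, adj_lists)))
    index = {v: i for i, v in enumerate(merged)}
    # flags[row][i] == 1 iff merged[i] occurs in adj_lists[row].
    flags = [[0] * len(merged) for _ in adj_lists]
    for row, lst in enumerate(adj_lists):
        for v in lst:
            flags[row][index[v]] = 1
    return merged, flags

def restore_list(merged, flags, row):
    """Rebuild one original adjacency list from the merged block."""
    return [v for v, bit in zip(merged, flags[row]) if bit]

block = [[2, 5, 9], [2, 5, 7, 9], [5, 9, 11]]
merged, flags = merge_block(block)
# merged == [2, 5, 7, 9, 11]; each original list is recoverable:
assert all(restore_list(merged, flags, i) == block[i]
           for i in range(len(block)))
```

The compression gain comes from the heavy overlap between neighbouring adjacency lists in Web graphs: each shared neighbour is stored once in the merged list, and the membership flags are cheap to encode.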
Grabowski, S., & Bieniecki, W. (2011). Merging adjacency lists for efficient web graph compression. Advances in Intelligent and Soft Computing, 103, 385–392. https://doi.org/10.1007/978-3-642-23169-8_42