Abstract
Search engines use numerous measures to rank webpages in search results. At Google, the so-called PageRank algorithm provides crucial information by quantifying the importance of webpages. The PageRank values are considered objective since the algorithm requires only data on the web's link structure. In this paper, we first introduce the original PageRank algorithm and give a brief overview of the distributed randomized approach recently proposed by the authors. We then consider the problem of computing the variation in the PageRank values that can be caused by erroneous data in the web structure when some links are fragile. An efficient (centralized) algorithm in the form of linear programming is proposed. This approach employs results from the field of interval matrices. Numerical examples are given to verify the effectiveness of the approach.
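To make the setting concrete, the original PageRank computation described above can be sketched as a power iteration on the "Google matrix". This is a minimal illustration of the standard centralized algorithm, not the authors' distributed randomized method or the linear-programming variation bounds; the 4-page link matrix and the damping parameter value are assumptions for the example.

```python
import numpy as np

def pagerank(A, m=0.15, tol=1e-10):
    """Power iteration for PageRank on a column-stochastic link matrix A.

    A[i, j] = 1/outdegree(j) if page j links to page i, else 0.
    m is the damping (teleportation) parameter; 0.15 is the commonly
    cited choice.
    """
    n = A.shape[0]
    # Convex combination of the link matrix and uniform teleportation.
    M = (1 - m) * A + (m / n) * np.ones((n, n))
    x = np.ones(n) / n  # start from the uniform distribution
    while True:
        x_new = M @ x
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new

# Hypothetical 4-page web: each column j distributes page j's rank
# evenly over its outgoing links (columns sum to 1).
A = np.array([
    [0.0, 0.0, 1.0, 0.5],
    [1/3, 0.0, 0.0, 0.0],
    [1/3, 0.5, 0.0, 0.5],
    [1/3, 0.5, 0.0, 0.0],
])
x = pagerank(A)
```

The fragile-link question studied in the paper then asks how much the vector `x` can change when some entries of `A` are uncertain, i.e., when certain links may or may not be present.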
Ishii, H., & Tempo, R. (2009). Computing the PageRank Variation for Fragile Web Data. SICE Journal of Control, Measurement, and System Integration, 2(1), 1–9. https://doi.org/10.9746/jcmsi.2.1