Consider a set of users and servers connected by a network. Each server provides a unique service that yields a certain benefit to each user. An attacker wishes to destroy a set of network edges so as to maximize his net gain, namely the total benefit of the disconnected users minus the total edge-destruction cost. We first show that the problem is polynomially solvable in the single-server case. In the multiple-server case, however, the problem is NP-hard; in particular, the network disconnection problem is already intractable with only two servers. We then develop a 3/2-approximation algorithm for the two-server case. © Springer-Verlag Berlin Heidelberg 2006.
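The single-server tractability claim can be illustrated with the standard reduction to minimum s–t cut: connect each user to an auxiliary sink by an arc whose capacity equals that user's benefit, give each network edge a capacity equal to its destruction cost, and observe that the attacker's maximum net gain equals the total benefit minus the minimum cut separating the server from the sink. The sketch below is ours, not the paper's algorithm; the function names (`max_flow`, `best_attack_gain`) and the Edmonds–Karp subroutine are illustrative assumptions.

```python
from collections import deque

def max_flow(cap, s, t, n):
    """Edmonds-Karp max-flow on an n x n capacity matrix (modified in place)."""
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:          # no augmenting path: flow is maximal
            return flow
        # bottleneck capacity along the path found
        v, aug = t, float('inf')
        while v != s:
            aug = min(aug, cap[parent[v]][v])
            v = parent[v]
        # push the bottleneck along the path
        v = t
        while v != s:
            u = parent[v]
            cap[u][v] -= aug
            cap[v][u] += aug
            v = u
        flow += aug

def best_attack_gain(n_nodes, server, benefits, edges):
    """Single-server case via min cut.

    benefits : {user: b_u}, the benefit each user derives from the server
    edges    : list of (u, v, destruction_cost), undirected network edges
    Returns the attacker's maximum net gain (disconnected benefit - cost).
    """
    t = n_nodes                      # auxiliary sink node
    n = n_nodes + 1
    cap = [[0] * n for _ in range(n)]
    for u, v, c in edges:            # destroying an edge costs c
        cap[u][v] += c
        cap[v][u] += c
    for u, b in benefits.items():    # keeping user u connected "costs" b_u
        cap[u][t] += b
    total = sum(benefits.values())
    # min cut = cheapest mix of destroyed edges and forgone benefits
    return total - max_flow(cap, server, t, n)
```

For example, with users 1 and 2 benefiting 10 and 5 from server 0, and edges (0,1) of cost 3 and (0,2) of cost 8, cutting only edge (0,1) is optimal with net gain 10 − 3 = 7.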
CITATION STYLE
Choi, B. C., & Hong, S. P. (2006). Two-server network disconnection problem. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3982 LNCS, pp. 785–792). Springer Verlag. https://doi.org/10.1007/11751595_83