We consider networks of Markov Decision Processes (MDPs) in which identical MDPs are placed on the N nodes of a graph G. The transition probabilities of each MDP depend on the states of its direct neighbors in the graph, and a run operates by selecting a random node and following a random transition in the chosen device MDP. As the state space of all configurations of the network is exponential in N, classical analyses are impractical. We study how a polynomial-size statistical representation of the system, which gives the densities of the subgraphs of width k, can be used to analyze its behavior, generalizing the approximate model checking of an MDP. We propose a Structured Population Protocol as a new population MDP whose states are statistical representations of the network and whose transitions are inferred from the statistical structure. Our main results show that for some large networks, the probability distributions of the statistics vectors of the population MDP approximate the probability distributions of the statistics vectors of the real process. Moreover, when the network has some regularity, both the real and the approximation processes converge to the same distributions. © 2013 Springer-Verlag.
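To make the idea concrete, here is a minimal sketch of the k = 1 case of the statistics vector and of the population-MDP approximation described above, assuming a toy two-state device MDP in which a node in state 0 may flip to 1 with probability proportional to the fraction of its neighbors in state 1. All function names (`density_vector`, `step`, `mean_field_step`) and the specific transition rule are our own illustrative assumptions, not the authors' construction.

```python
import random


def density_vector(states, num_states=2):
    """k = 1 statistics vector: the fraction of nodes in each local state."""
    n = len(states)
    return [sum(1 for s in states if s == q) / n for q in range(num_states)]


def step(states, adj, p_flip=0.5, rng=None):
    """One run step on the real (exponential-size) process: pick a random
    node and follow a random transition whose probability depends on the
    states of its direct neighbors (here: a node in state 0 flips to 1
    with probability p_flip times the fraction of neighbors in state 1)."""
    rng = rng or random.Random()
    i = rng.randrange(len(states))
    nbrs = adj[i]
    frac_one = sum(states[j] for j in nbrs) / len(nbrs) if nbrs else 0.0
    new = list(states)
    if states[i] == 0 and rng.random() < p_flip * frac_one:
        new[i] = 1
    return new


def mean_field_step(d, n, p_flip=0.5):
    """Population-MDP transition on the polynomial-size representation:
    update the density vector directly, replacing each node's actual
    neighborhood by the global density d[1]. One node updates per step,
    hence the 1/n factor on the expected change."""
    d0, d1 = d
    delta = d0 * p_flip * d1 / n
    return [d0 - delta, d1 + delta]
```

For example, on a 6-cycle with alternating states, `density_vector` returns `[0.5, 0.5]`, and iterating `mean_field_step` tracks the expected drift of that vector without ever enumerating the 2^N configurations.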
CITATION STYLE
De Rougemont, M., & Tracol, M. (2013). In T. Muntean, D. Poulakis, & R. Rolland (Eds.), Algebraic Informatics, Lecture Notes in Computer Science (Vol. 8080, pp. 199–210). Berlin, Heidelberg: Springer Berlin Heidelberg. Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-84884722062&partnerID=tZOtx3y1