Algebraic Informatics

  • De Rougemont M
  • Tracol M
ISSN: 0302-9743
Citations: N/A
Readers: 18 (Mendeley)

Abstract

We consider networks of Markov Decision Processes (MDPs) in which identical MDPs are placed on the N nodes of a graph G. The transition probabilities of an MDP depend on the states of its direct neighbors in the graph, and a run proceeds by selecting a random node and following a random transition of the MDP at that node. As the state space of all configurations of the network is exponential in N, a classical analysis is impractical. We study how a polynomial-size statistical representation of the system, which gives the densities of the subgraphs of width k, can be used to analyze its behavior, generalizing the approximate Model Checking of an MDP. We propose a Structured Population Protocol as a new Population MDP whose states are statistical representations of the network and whose transitions are inferred from the statistical structure. Our main results show that for some large networks, the probability distributions of the statistics vectors of the population MDP approximate those of the real process. Moreover, when the network has some regularity, both the real process and its approximation converge to the same distributions. © 2013 Springer-Verlag.
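The sketch below is not the authors' construction; it is a minimal illustration of the setting described in the abstract, under assumed details: a binary local state space, a toy neighbor-dependent transition rule, and the simplest statistics vector (width k = 1, i.e., the density of each local state over the N nodes). It shows why the statistical representation is polynomial in N while the full configuration space is exponential.

```python
import random
from collections import Counter

STATES = [0, 1]  # assumed binary local state space of each identical MDP


def local_transition(state, neighbor_states, rng):
    """Toy local step: the probability of moving to state 1 grows with the
    fraction of neighbors already in state 1 (an illustrative rule only)."""
    p_one = sum(neighbor_states) / len(neighbor_states) if neighbor_states else 0.5
    return 1 if rng.random() < p_one else 0


def run_step(config, graph, rng):
    """One step of the global process: pick a random node and update the
    MDP at that node, conditioned on its direct neighbors."""
    node = rng.randrange(len(config))
    neighbors = [config[m] for m in graph[node]]
    new_config = list(config)
    new_config[node] = local_transition(config[node], neighbors, rng)
    return new_config


def statistics_vector(config):
    """Width-1 statistics: the density of each local state over the N nodes.
    Its size is polynomial in N, unlike the exponential configuration space."""
    counts = Counter(config)
    n = len(config)
    return {s: counts.get(s, 0) / n for s in STATES}


if __name__ == "__main__":
    rng = random.Random(0)
    N = 20
    # Example graph G: a cycle on N nodes, given as adjacency lists.
    graph = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
    config = [rng.choice(STATES) for _ in range(N)]
    for _ in range(100):
        config = run_step(config, graph, rng)
    print(statistics_vector(config))
```

The paper's Population MDP evolves such statistics vectors directly, with transitions inferred from the statistical structure rather than simulated on full configurations as above; extending the sketch to subgraph densities of width k > 1 would require counting local neighborhood patterns instead of single-node states.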

Cite

CITATION STYLE

APA

De Rougemont, M., & Tracol, M. (2013). Algebraic Informatics. (T. Muntean, D. Poulakis, & R. Rolland, Eds.), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8080, pp. 199–210). Berlin, Heidelberg: Springer Berlin Heidelberg. Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-84884722062&partnerID=tZOtx3y1
