Towards Differentially Private Aggregation of Heterogeneous Robots


Abstract

We are interested in securing the operation of robot swarms composed of heterogeneous agents that collaborate by exploiting aggregation mechanisms. Since any given robot type plays a role that may be critical in guaranteeing continuous and failure-free operation of the system, it is beneficial to conceal individual robot types and, thus, their roles. In our work, we assume that an adversary gains access to a description of the dynamic state of the swarm in its non-transient, nominal regime. We propose a method that quantifies how easy it is for the adversary to identify the type of any of the robots, based on this observation. We draw from the theory of differential privacy to propose a closed-form expression of the leakage of the system at steady-state. Our results show how this model enables an analysis of the leakage as system parameters vary; they also indicate design rules for increasing privacy in aggregation mechanisms.
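
To make the leakage notion concrete, the sketch below estimates an epsilon-style leakage value as the worst-case log-likelihood ratio between the steady-state distributions induced by two "adjacent" swarm compositions (e.g., differing in one robot's type). This is a minimal illustration in the spirit of differential privacy, not the paper's closed-form expression; the distributions, the adjacency notion, and the function name `dp_leakage` are assumptions made for illustration only.

```python
import numpy as np

def dp_leakage(p, q, eps_floor=1e-12):
    """Worst-case absolute log-ratio ("epsilon") between two discrete
    steady-state distributions p and q over the same observable swarm
    states. Smaller values mean the two underlying swarm compositions
    are harder to distinguish from the observation.

    Note: this is an illustrative proxy, not the closed-form leakage
    derived in the paper.
    """
    p = np.clip(np.asarray(p, dtype=float), eps_floor, None)
    q = np.clip(np.asarray(q, dtype=float), eps_floor, None)
    p, q = p / p.sum(), q / q.sum()          # renormalize after clipping
    return float(np.max(np.abs(np.log(p) - np.log(q))))

# Hypothetical example: steady-state occupancy of three aggregation sites
# under two adjacent swarm compositions (one robot's type swapped).
p = [0.50, 0.30, 0.20]   # composition A
q = [0.45, 0.33, 0.22]   # composition B, adjacent to A
print(f"leakage (epsilon) ~ {dp_leakage(p, q):.3f}")
```

Under this reading, lowering the leakage value corresponds to design choices that make the steady-state behavior of adjacent compositions more similar, which is the direction of the design rules discussed in the abstract.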

Citation (APA)

Prorok, A., & Kumar, V. (2018). Towards Differentially Private Aggregation of Heterogeneous Robots. In Springer Proceedings in Advanced Robotics (Vol. 6, pp. 587–601). Springer Science and Business Media B.V. https://doi.org/10.1007/978-3-319-73008-0_41
