We put forward a new family of computational assumptions, the Kernel Matrix Diffie-Hellman Assumption. Given a matrix A sampled from a distribution D, the kernel assumption states that it is hard to find, "in the exponent", a nonzero vector in the kernel of Aᵀ. This family is a natural computational analogue of the Matrix Decisional Diffie-Hellman Assumption (MDDH) proposed by Escala et al., and as such it extends the advantages of their algebraic framework to computational assumptions. The k-Decisional Linear Assumption is an example of a family of decisional assumptions of strictly increasing hardness as k grows. We show that, for any such family of MDDH assumptions, the corresponding Kernel assumptions are likewise strictly increasingly weaker. Proving this requires ruling out the existence of certain black-box reductions between flexible problems, i.e., computational problems with a non-unique solution.
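As a reading aid, the following is a minimal LaTeX sketch of how the Kernel Matrix Diffie-Hellman assumption is typically stated in the implicit ("bracket") notation of Escala et al. The parameters ℓ and k and the k-Linear matrix distribution shown at the end are standard choices from that framework, not details quoted from the abstract above.

% Sketch of the D_{l,k}-KerMDH assumption in implicit (bracket) notation.
% The parameter names and the k-Linear distribution L_k are illustrative
% assumptions taken from the standard MDDH framework, not from the abstract.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $\mathbb{G}$ be a cyclic group of prime order $q$ with generator
$\mathcal{P}$, and write $[a] := a\,\mathcal{P}$ for the implicit
representation of $a \in \mathbb{Z}_q$ (extended componentwise to vectors
and matrices). Let $\mathcal{D}_{\ell,k}$ be a distribution over
full-rank matrices in $\mathbb{Z}_q^{\ell \times k}$ with $\ell > k$.

\paragraph{Kernel MDH.} The $\mathcal{D}_{\ell,k}$-KerMDH assumption
states that for every PPT adversary $\mathcal{A}$,
\[
  \Pr\!\left[
     \mathbf{x}^{\top}\mathbf{A} = \mathbf{0}^{\top}
     \;\wedge\; \mathbf{x} \neq \mathbf{0}
     \;:\;
     \mathbf{A} \leftarrow \mathcal{D}_{\ell,k},\;
     [\mathbf{x}] \leftarrow \mathcal{A}\big([\mathbf{A}]\big)
  \right]
  \;\approx\; 0 ,
\]
i.e.\ given $[\mathbf{A}]$ it is hard to output, in the exponent, a
nonzero vector $[\mathbf{x}] \in \mathbb{G}^{\ell}$ lying in the kernel
of $\mathbf{A}^{\top}$.

\paragraph{Example distribution.} For the $k$-Linear family one takes
$\ell = k+1$ and
\[
  \mathcal{L}_k :\quad
  \mathbf{A} =
  \begin{pmatrix}
    a_1    & 0      & \cdots & 0      \\
    0      & a_2    & \cdots & 0      \\
    \vdots &        & \ddots & \vdots \\
    0      & 0      & \cdots & a_k    \\
    1      & 1      & \cdots & 1
  \end{pmatrix},
  \qquad a_i \leftarrow \mathbb{Z}_q^{*},
\]
whose KerMDH counterpart is the assumption that, per the abstract,
becomes strictly weaker as $k$ grows.

\end{document}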
Citation:
Morillo, P., Ràfols, C., & Villar, J. L. (2016). The kernel matrix Diffie-Hellman assumption. In Lecture Notes in Computer Science, Vol. 10031, pp. 729–758. Springer. https://doi.org/10.1007/978-3-662-53887-6_27