In this paper we show that a large class of diverse problems have a bicomposite structure which makes it possible to solve them with a new type of algorithm called dissection, which has much better time/memory tradeoffs than previously known algorithms. A typical example is the problem of finding the key of multiple encryption schemes with r independent n-bit keys. All the previous error-free attacks required time T and memory M satisfying TM = 2^{rn}, and even if "false negatives" are allowed, no attack could achieve TM < 2^{3rn/4}. Our new technique yields the first algorithm which never errs and finds all the possible keys with a smaller product TM, such as T = 2^{4n} time and M = 2^{n} memory for breaking the sequential execution of r = 7 block ciphers. The improvement ratio we obtain increases in an unbounded way as r increases, and if we allow algorithms which can sometimes miss solutions, we can get even better tradeoffs by combining our dissection technique with parallel collision search. To demonstrate the generality of the new dissection technique, we show how to use it in a generic way in order to attack hash functions with a rebound attack, to solve hard knapsack problems, and to find the shortest solution to a generalized version of Rubik's cube with better time complexities (for small memory complexities) than the best previously known algorithms. © 2012 International Association for Cryptologic Research.
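As a concrete illustration of the dissection idea, the sketch below works through the basic r = 4 case on a toy 8-bit "block cipher". It is a minimal sketch under stated assumptions, not the authors' implementation: the toy cipher, all helper names, and the parameter choices are illustrative. The point is only the structure of the attack, which guesses the value at the middle of the cascade for one plaintext, solves each half with a small meet-in-the-middle step, and joins the halves on a second plaintext, giving roughly 2^{2n} time with 2^{n} memory instead of the 2^{2n} memory of a plain meet-in-the-middle over (k1,k2) versus (k3,k4).

```python
# Toy sketch (illustrative, not the authors' code) of the basic dissection step
# for r = 4 sequential encryptions with independent n-bit keys.
import random

N_BITS = 8                       # toy block/key size; real attacks target n = 64 or 128
SIZE = 1 << N_BITS
MASK = SIZE - 1

def rotl(x, r):
    return ((x << r) | (x >> (N_BITS - r))) & MASK

def rotr(x, r):
    return ((x >> r) | (x << (N_BITS - r))) & MASK

def enc(k, x):                   # one toy encryption layer (invertible in x)
    return (rotl(x ^ k, 3) + k) & MASK

def dec(k, y):                   # inverse of one layer
    return rotr((y - k) & MASK, 3) ^ k

def encrypt4(keys, p):           # the 4-fold cascade E_k4(E_k3(E_k2(E_k1(p))))
    for k in keys:
        p = enc(k, p)
    return p

def dissect4(pairs):
    """Recover (k1,k2,k3,k4) from known plaintext/ciphertext pairs in ~2^(2n)
    time and ~2^n memory."""
    (p1, c1), (p2, c2) = pairs[0], pairs[1]
    for g in range(SIZE):                    # guess the middle value E_k2(E_k1(p1))
        # Upper half: every (k1, k2) mapping p1 to g, found by MITM and
        # indexed by the middle value it assigns to the second plaintext p2.
        fwd = {}
        for k1 in range(SIZE):
            fwd.setdefault(enc(k1, p1), []).append(k1)
        top = {}
        for k2 in range(SIZE):
            for k1 in fwd.get(dec(k2, g), []):
                top.setdefault(enc(k2, enc(k1, p2)), []).append((k1, k2))
        # Lower half: every (k3, k4) mapping g to c1, found by a second MITM.
        bwd = {}
        for k4 in range(SIZE):
            bwd.setdefault(dec(k4, c1), []).append(k4)
        for k3 in range(SIZE):
            for k4 in bwd.get(enc(k3, g), []):
                mid2 = dec(k3, dec(k4, c2))  # where p2 must sit in the middle
                for (k1, k2) in top.get(mid2, []):
                    keys = (k1, k2, k3, k4)
                    # The first two pairs hold by construction; check the rest.
                    if all(encrypt4(keys, p) == c for p, c in pairs[2:]):
                        return keys
    return None

if __name__ == "__main__":
    secret = tuple(random.randrange(SIZE) for _ in range(4))
    plaintexts = random.sample(range(SIZE), 6)   # enough pairs to filter spurious keys
    pairs = [(p, encrypt4(secret, p)) for p in plaintexts]
    print("secret:   ", secret)
    print("recovered:", dissect4(pairs))
```

Roughly speaking, the paper's recursive constructions build on this step: guessing additional intermediate values splits longer cascades (such as the r = 7 case with T = 2^{4n} and M = 2^{n} mentioned above) into smaller dissection and meet-in-the-middle subproblems.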
Dinur, I., Dunkelman, O., Keller, N., & Shamir, A. (2012). Efficient dissection of composite problems, with applications to cryptanalysis, knapsacks, and combinatorial search problems. In Advances in Cryptology – CRYPTO 2012, Lecture Notes in Computer Science, Vol. 7417 (pp. 719–740). Springer. https://doi.org/10.1007/978-3-642-32009-5_42