Abstract
Memories contribute significantly to the overall power, performance and area (PPA) of modern integrated electronic systems. Owing to their regular structure, memories are generated by memory compilers in modern industrial designs. Although such compilers provide PPA-efficient and silicon-verified layouts, the large and growing number of input parameters to the compilers themselves results in a new challenge: selecting compiler parameters that meet design requirements. The dimensionality of the search space as well as the count of memories prohibits manual tuning in fast-paced design cycles. To efficiently select optimal compiler parameters, we devise regression neural networks as PPA models of memory compilers, based on which an optimal parameterization can be selected. Highly accurate PPA estimates are a prerequisite to a reliable optimization. While regression with multiple targets can easily be achieved by neural networks with multiple output units, model accuracy depends highly on architecture and hyperparameters. We study how neural network prediction error on multi-target regression problems can be reduced, validating recent findings that partial parameter sharing is beneficial to this class of problems. Our real-world application confirms the benefits of partial sharing for multi-target regression and its applicability to the sigmoid activation function. The accuracy of memory compiler PPA prediction is improved by approximately ten percent on average, decreasing worst-case prediction errors by over 50 percent.
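To illustrate the architecture the abstract refers to, the following is a minimal sketch of a partially shared multi-target regression network: a shared hidden layer feeds per-target private heads, so the targets (e.g. power, performance, area) share a common representation while retaining target-specific parameters. All dimensions, layer sizes, and weight initializations here are hypothetical, and the paper's actual architecture and training procedure are not reproduced; this only shows the forward-pass structure with sigmoid activations.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation, as used in the paper's networks."""
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Hypothetical sizes: compiler parameters in, one regression output per PPA target.
n_in, n_shared, n_private, n_targets = 6, 16, 8, 3

# Shared trunk: one hidden layer whose weights serve all targets.
W_shared = rng.normal(scale=0.1, size=(n_in, n_shared))
b_shared = np.zeros(n_shared)

# Partial sharing: each target additionally owns a private hidden layer and output.
W_private = [rng.normal(scale=0.1, size=(n_shared, n_private)) for _ in range(n_targets)]
b_private = [np.zeros(n_private) for _ in range(n_targets)]
W_out = [rng.normal(scale=0.1, size=(n_private, 1)) for _ in range(n_targets)]
b_out = [np.zeros(1) for _ in range(n_targets)]

def forward(x):
    """Shared representation first, then one private head per regression target."""
    h = sigmoid(x @ W_shared + b_shared)                  # shared hidden layer
    outs = []
    for t in range(n_targets):
        p = sigmoid(h @ W_private[t] + b_private[t])      # target-private layer
        outs.append(p @ W_out[t] + b_out[t])              # linear regression output
    return np.concatenate(outs, axis=-1)

x = rng.normal(size=(4, n_in))   # batch of 4 hypothetical compiler parameterizations
y = forward(x)
print(y.shape)                   # (4, 3): one prediction per PPA target
```

A fully shared network would instead map `h` directly to all three outputs through one matrix; the private heads are what distinguishes the partial-sharing variant.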
Last, F., & Schlichtmann, U. (2020). Partial sharing neural networks for multi-target regression on power and performance of embedded memories. In MLCAD 2020 - Proceedings of the 2020 ACM/IEEE Workshop on Machine Learning for CAD (pp. 123–128). Association for Computing Machinery, Inc. https://doi.org/10.1145/3380446.3430642