A JIT Compiler for Neural Network Inference


Abstract

This paper describes a C++ library that compiles neural network models at runtime into machine code that performs inference. In general, this approach promises the best possible performance, since statically known properties of the network can be integrated directly into the generated code. In our experiments on the NAO V6 platform, it significantly outperforms existing implementations on small networks, while being inferior on large networks. The library was already part of the B-Human code release 2018 [12], but has been extended since and is now available as a standalone version that can be integrated into any C++14 code base [18].

Cite (APA)

Thielke, F., & Hasselbring, A. (2019). A JIT Compiler for Neural Network Inference. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11531 LNAI, pp. 448–456). Springer. https://doi.org/10.1007/978-3-030-35699-6_36
