In recent years, many machine-learning methods for tabular data have been introduced that use factorization machines, neural networks, or both. The resulting variety of methods makes it non-obvious which one should be used in practice. We begin by extending the previously established theoretical connection between polynomial neural networks and factorization machines (FM) to recently introduced FM techniques. This allows us to propose a single neural-network-based framework that can switch between the deep learning and FM paradigms by a simple change of an activation function. We further show that an activation function exists which can adaptively learn to select the optimal paradigm. Another key element of our framework is its ability to learn high-dimensional embeddings by low-rank factorization. Our framework can handle numeric and categorical data as well as multiclass outputs. Extensive empirical experiments verify our analytical claims. Source code is available at https://github.com/ChenAlmagor/FiFa
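To make the FM/neural-network connection concrete, the sketch below shows the standard second-order factorization machine term (computed with the well-known O(nk) identity) alongside an ordinary hidden layer built from the same embedding projection. The point it illustrates, loosely following the abstract, is that a polynomial (squaring) activation over the projection recovers FM-style pairwise interactions, while a nonlinearity such as ReLU yields a conventional DNN layer. This is a hypothetical illustration using standard FM formulas, not the authors' FiFa implementation.

```python
import numpy as np

def fm_second_order(X, V):
    """Second-order FM term: sum_{i<j} <v_i, v_j> x_i x_j,
    computed in O(nk) via the identity
    0.5 * sum_f ((sum_i v_if x_i)^2 - sum_i v_if^2 x_i^2)."""
    proj = X @ V                      # (batch, k) embedding projection
    squares = (X ** 2) @ (V ** 2)     # (batch, k) self-interaction terms
    return 0.5 * np.sum(proj ** 2 - squares, axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 6))   # 4 samples, 6 features
V = rng.normal(size=(6, 3))   # rank-3 feature embeddings

# FM paradigm: squaring (polynomial) activation of the projection.
fm_out = fm_second_order(X, V)

# DNN paradigm: the same projection through a ReLU activation.
mlp_out = np.maximum(X @ V, 0.0)
```

The same learned embedding matrix `V` feeds both branches; only the activation applied after the projection differs, which is the paradigm switch the abstract describes.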
Almagor, C., & Hoshen, Y. (2022). You Say Factorization Machine, I Say Neural Network-It’s All in the Activation. In RecSys 2022 - Proceedings of the 16th ACM Conference on Recommender Systems (pp. 389–398). Association for Computing Machinery, Inc. https://doi.org/10.1145/3523227.3551499