BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection


Abstract

As various forms of fraud proliferate on Ethereum, it is imperative to safeguard against these malicious activities to protect susceptible users from being victimized. While current studies rely solely on graph-based fraud detection approaches, we argue that these may not be well suited to the highly repetitive, skew-distributed, and heterogeneous nature of Ethereum transactions. To address these challenges, we propose BERT4ETH, a universal pre-trained Transformer encoder that serves as an account representation extractor for detecting various fraud behaviors on Ethereum. BERT4ETH leverages the superior modeling capability of the Transformer to capture the dynamic sequential patterns inherent in Ethereum transactions, and addresses the challenges of pre-training a BERT model for Ethereum with three practical and effective strategies: repetitiveness reduction, skew alleviation, and heterogeneity modeling. Our empirical evaluation demonstrates that BERT4ETH outperforms state-of-the-art methods with significant improvements on the phishing account detection and de-anonymization tasks. The code for BERT4ETH is available at: https://github.com/git-disl/BERT4ETH.
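To give a flavor of the repetitiveness-reduction strategy, the sketch below collapses consecutive repeated counterparty addresses in an account's transaction sequence before it would be fed to the Transformer. The function name, merging rule, and example addresses are illustrative assumptions, not the paper's exact implementation:

```python
from itertools import groupby

def reduce_repetition(tx_sequence):
    """Collapse runs of consecutive identical counterparty addresses,
    recording how many transactions each run merged.

    Hypothetical illustration of repetitiveness reduction; BERT4ETH's
    actual preprocessing may differ in detail."""
    return [(addr, sum(1 for _ in run)) for addr, run in groupby(tx_sequence)]

# Example: an account that repeatedly transacts with the same exchange.
seq = ["0xAAA", "0xAAA", "0xAAA", "0xBBB", "0xAAA", "0xCCC", "0xCCC"]
print(reduce_repetition(seq))
# → [('0xAAA', 3), ('0xBBB', 1), ('0xAAA', 1), ('0xCCC', 2)]
```

Merging repeated interactions shortens the input sequence and keeps the frequency signal as an explicit count, rather than forcing the model to spend attention capacity on near-duplicate tokens.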

Citation (APA)

Hu, S., Zhang, Z., Luo, B., Lu, S., He, B., & Liu, L. (2023). BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection. In ACM Web Conference 2023 - Proceedings of the World Wide Web Conference, WWW 2023 (pp. 2189–2197). Association for Computing Machinery, Inc. https://doi.org/10.1145/3543507.3583345
