Separable Self-Attention Mechanism for Point Cloud Local and Global Feature Modeling


Abstract

The self-attention mechanism has an excellent ability to capture long-range dependencies in data. To enable self-attention in point cloud tasks to attend to both local and global context, we design a separable self-attention mechanism for point clouds that decomposes the construction of a point cloud's attention map into two steps: Intra-patch Attention and Inter-patch Attention. The former computes an attention map over the tokens of the points within each local patch to mine fine-grained local semantic relationships, while the latter constructs an attention map among all patches to mine long-range interaction information. The two attention mechanisms work in parallel, capturing fine-grained local patterns while accounting for the global scene. Equipped with the Intra-patch Attention and Inter-patch Attention modules, we build a hierarchical end-to-end point cloud analysis architecture called Separable Transformer, and we conduct extensive experiments demonstrating that its performance is highly competitive with state-of-the-art methods.
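The following is a minimal sketch (not the authors' implementation) of the two-branch idea described above: intra-patch attention attends among the points inside each local patch, while inter-patch attention attends among patch-level tokens, and the two branches run in parallel before being fused. The tensor layout, the mean pooling used to form patch tokens, and the fusion layer are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class SeparableSelfAttention(nn.Module):
    """Illustrative parallel intra-patch / inter-patch attention block."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Standard multi-head self-attention reused for both branches.
        self.intra_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.inter_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, P, K, C) -- B point clouds, P patches, K points per patch, C channels.
        B, P, K, C = x.shape

        # Intra-patch branch: attention among the K points of every patch.
        local = x.reshape(B * P, K, C)
        local, _ = self.intra_attn(local, local, local)
        local = local.reshape(B, P, K, C)

        # Inter-patch branch: pool each patch to one token (assumed: mean pooling),
        # then attend among the P patch tokens for long-range interactions.
        tokens = x.mean(dim=2)                           # (B, P, C)
        glob, _ = self.inter_attn(tokens, tokens, tokens)
        glob = glob.unsqueeze(2).expand(-1, -1, K, -1)   # broadcast back to points

        # Fuse the parallel local and global branches.
        return self.fuse(torch.cat([local, glob], dim=-1))


if __name__ == "__main__":
    feats = torch.randn(2, 32, 16, 64)   # 2 clouds, 32 patches, 16 points, 64-d features
    out = SeparableSelfAttention(dim=64)(feats)
    print(out.shape)                      # torch.Size([2, 32, 16, 64])
```

In this sketch the two branches share the same token dimension and are simply concatenated and projected; the paper's actual fusion strategy and patch construction may differ.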

Citation (APA)

Wang, F., Wang, X., Lv, D., Zhou, L., & Shi, G. (2022). Separable Self-Attention Mechanism for Point Cloud Local and Global Feature Modeling. IEEE Access, 10, 129823–129831. https://doi.org/10.1109/ACCESS.2022.3228044
