DiffusionBERT: Improving Generative Masked Language Models with Diffusion Models

24 Citations · 93 Mendeley Readers

Abstract

We present DiffusionBERT, a new generative masked language model based on discrete diffusion models. Diffusion models and many pre-trained language models share a training objective, i.e., denoising, making it possible to combine the two and enjoy the best of both worlds. On the one hand, diffusion models offer a promising training strategy that helps improve generation quality. On the other hand, pre-trained denoising language models (e.g., BERT) can serve as a good initialization that accelerates convergence. We explore training BERT to learn the reverse process of a discrete diffusion process with an absorbing state and elucidate several designs to improve it. First, we propose a new noise schedule for the forward diffusion process that controls the degree of noise added at each step based on the information carried by each token. Second, we investigate several designs for incorporating the time step into BERT. Experiments on unconditional text generation demonstrate that DiffusionBERT achieves significant improvement over existing diffusion models for text (e.g., D3PM and Diffusion-LM) and previous generative masked language models in terms of perplexity and BLEU score. Promising results on conditional generation tasks show that DiffusionBERT can generate text of comparable quality and higher diversity than a series of established baselines.
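To make the forward process described above concrete, the following is a minimal sketch (not the paper's exact implementation) of an absorbing-state corruption step in PyTorch. It assumes a simple linear cumulative masking probability and an optional, hypothetical per-token `token_weight` as a stand-in for the information-based noise schedule; the mask token id is hard-coded to BERT's `[MASK]` (103 in bert-base-uncased), which is an assumption rather than a detail taken from the paper.

```python
import torch

MASK_ID = 103  # [MASK] id in bert-base-uncased (assumption)

def absorbing_forward(x0, t, T, token_weight=None):
    """Sample x_t ~ q(x_t | x_0) for an absorbing-state discrete diffusion.

    Each token is independently replaced by [MASK] with a probability that
    grows with t; by t = T every token is masked. `token_weight` (optional,
    hypothetical) scales the per-token masking probability, loosely mimicking
    an information-based schedule where more informative tokens are masked
    earlier or later.
    """
    # Baseline cumulative masking probability: linear in t (a simple choice,
    # not necessarily the schedule used in the paper).
    p = t / T
    if token_weight is not None:
        # Per-token adjustment; clamp to keep a valid probability.
        p = (p * token_weight).clamp(0.0, 1.0)
    mask = torch.rand_like(x0, dtype=torch.float) < p
    return torch.where(mask, torch.full_like(x0, MASK_ID), x0)

# Usage: progressively corrupt a toy token sequence.
x0 = torch.tensor([[2023, 2003, 1037, 7099, 6251]])  # arbitrary token ids
for t in (1, 5, 10):
    print(t, absorbing_forward(x0, t, T=10).tolist())
```

The reverse process would then train BERT (conditioned on the time step) to predict the original tokens at the masked positions, which is why a pre-trained masked language model is a natural initialization.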



Citation (APA)

He, Z., Sun, T., Tang, Q., Wang, K., Huang, X., & Qiu, X. (2023). DiffusionBERT: Improving Generative Masked Language Models with Diffusion Models. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 4521–4534). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.248

Readers over time: chart of Mendeley readers per year, '22–'25 (omitted).

Readers' Seniority

PhD / Post grad / Masters / Doc: 19 (58%)
Researcher: 11 (33%)
Professor / Associate Prof.: 2 (6%)
Lecturer / Post doc: 1 (3%)

Readers' Discipline

Computer Science: 36 (84%)
Engineering: 3 (7%)
Chemistry: 2 (5%)
Physics and Astronomy: 2 (5%)
