Automatic Generation and Evaluation of Chinese Classical Poetry with Attention-Based Deep Neural Network

Abstract

Computer generation of poetry has been studied for more than a decade, yet generating poetry at a human level remains a great challenge. We present a novel Transformer-XL-based classical Chinese poetry model that employs a multi-head self-attention mechanism to capture the deeper, multiple relationships among Chinese characters. Furthermore, we utilize the segment-level recurrence mechanism to learn longer-term dependencies and overcome the context-fragmentation problem. To assess the quality of the generated poems automatically, we also built a novel evaluation model that contains a BERT-based module for checking the fluency of sentences and a tone-checker module for evaluating the tone pattern of poems. Poems generated with our model obtained an average score of 9.7 for fluency and 10.0 for tone pattern. Moreover, visualizing the attention mechanism shows that our model learned the tone-pattern rules. All experimental results demonstrate that our poetry generation model can generate high-quality poems.
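The multi-head self-attention the abstract refers to can be sketched in a few lines. This is a minimal illustrative implementation with random weights standing in for learned parameters, not the authors' model; the function name and shapes are assumptions for the sketch.

```python
# Minimal sketch of multi-head self-attention (illustrative only; random
# weights stand in for the learned projections of the paper's model).
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, n_heads, rng):
    seq_len, d_model = x.shape
    assert d_model % n_heads == 0
    d_head = d_model // n_heads
    # Hypothetical random projection matrices for Q, K, V and the output.
    w_q, w_k, w_v, w_o = (
        rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
        for _ in range(4)
    )
    q, k, v = x @ w_q, x @ w_k, x @ w_v

    def split(t):
        # (seq_len, d_model) -> (n_heads, seq_len, d_head)
        return t.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    q, k, v = split(q), split(k), split(v)
    # Scaled dot-product attention per head: (n_heads, seq_len, seq_len).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    out = attn @ v  # (n_heads, seq_len, d_head)
    # Concatenate heads and apply the output projection.
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ w_o, attn

rng = np.random.default_rng(0)
x = rng.standard_normal((7, 16))        # 7 character positions, d_model = 16
y, attn = multi_head_self_attention(x, n_heads=4, rng=rng)
```

Each head's attention rows are probability distributions over the seven positions; it is these per-head weight maps that the authors visualize to argue the model learned tone-pattern rules.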

Citation (APA)

Zhao, J., & Lee, H. J. (2022). Automatic Generation and Evaluation of Chinese Classical Poetry with Attention-Based Deep Neural Network. Applied Sciences (Switzerland), 12(13). https://doi.org/10.3390/app12136497
