Improving Factuality of Abstractive Summarization without Sacrificing Summary Quality

Abstract

Improving factual consistency of abstractive summarization has been a widely studied topic. However, most prior work on training factuality-aware models has ignored the negative effect this has on summary quality. We propose EFACTSUM (i.e., Effective Factual Summarization), a candidate summary generation and ranking technique to improve summary factuality without sacrificing summary quality. We show that using a contrastive learning framework with our refined candidate summaries leads to significant gains on both factuality and similarity-based metrics. Specifically, we propose a ranking strategy in which we effectively combine two metrics, thereby preventing any conflict during training. Models trained using our approach show up to 6 points of absolute improvement over the base model with respect to FactCC on XSUM and 11 points on CNN/DM, without negatively affecting either similarity-based metrics or abstractiveness.
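The core idea of combining a factuality metric and a similarity metric so they do not conflict when ranking candidate summaries can be sketched as follows. This is a minimal illustration, not the authors' exact method: the function names, scores, and the specific combination rule (rank by factuality first, break ties by similarity) are assumptions for exposition.

```python
# Hypothetical sketch of conflict-free candidate ranking in the spirit
# of EFACTSUM: candidates are ordered primarily by a factuality score
# (e.g., FactCC) and secondarily by a similarity score (e.g., ROUGE),
# so the similarity signal can never overturn the factuality ordering.
# All names and the combination rule are illustrative assumptions.

def rank_candidates(candidates, factuality_scores, similarity_scores):
    """Return candidates ordered by factuality, ties broken by similarity."""
    scored = list(zip(candidates, factuality_scores, similarity_scores))
    # Lexicographic sort on (factuality, similarity): the secondary key
    # only matters when factuality is tied, so the two metrics cannot
    # give contradictory training signals.
    scored.sort(key=lambda t: (t[1], t[2]), reverse=True)
    return [cand for cand, _, _ in scored]

ranked = rank_candidates(
    ["summary A", "summary B", "summary C"],
    factuality_scores=[0.4, 0.9, 0.9],
    similarity_scores=[0.8, 0.5, 0.7],
)
# ranked == ["summary C", "summary B", "summary A"]
```

A ranking like this can then supply the ordered positive/negative pairs that a contrastive training objective consumes.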


Citation (APA)

Dixit, T., Wang, F., & Chen, M. (2023). Improving Factuality of Abstractive Summarization without Sacrificing Summary Quality. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 902–913). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-short.78

