Future Directions and Conclusions

Abstract

It is quite remarkable that BERT debuted in October 2018, only around three years ago. Taking a step back and reflecting, we see that the field has made an incredible amount of progress in a short amount of time. As we noted in the Introduction and demonstrated throughout this book, the foundations of applying BERT and other transformer architectures to ranking are already quite sturdy: the improvements in effectiveness attributable to, for example, the simple monoBERT design are substantial, robust, and have been widely replicated across many tasks. We can confidently assert that the state of the art has advanced significantly over this time span [Lin, 2019], a period notable for the amount of interest, attention, and activity that transformer architectures have generated. These are exciting times!

Citation

Lin, J., Nogueira, R., & Yates, A. (2022). Future Directions and Conclusions. In Synthesis Lectures on Human Language Technologies (pp. 239–253). Springer Nature. https://doi.org/10.1007/978-3-031-02181-7_6
