TopiQAL: Topic-aware Question Answering using Scalable Domain-specific Supercomputers

Abstract

We all have questions: about today's temperature, the scores of our favorite baseball team, the Universe, and about a vaccine for COVID-19. Life, physical, and natural scientists have been trying to answer questions on many topics using scientific methods and experiments, while computer scientists have built language models as a small step toward automatically answering questions across domains given a little context. In this paper, we propose an architecture using state-of-the-art Natural Language Processing models, namely topic models and Bidirectional Encoder Representations from Transformers (BERT), that can transparently and automatically retrieve articles relevant to a question across domains and fetch answers to topical questions from current and historical COVID-19 medical research literature. We demonstrate the benefits of domain-specific supercomputers such as Tensor Processing Units (TPUs) residing on cloud-based infrastructure, with which we achieved significant gains in training and inference times at very minimal cost.
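The abstract describes a two-stage pipeline: a topic model first narrows the corpus to articles on the question's topic, then a BERT question-answering model extracts an answer span from the retained articles. Below is a minimal sketch of that idea, not the authors' released code; the library choices (gensim for LDA, Hugging Face transformers for BERT QA), the pretrained model name, and the toy corpus are all assumptions for illustration.

```python
# Hypothetical sketch of topic-filtered BERT QA; not the paper's implementation.
from gensim import corpora, models
from transformers import pipeline

# Toy stand-in for a research-literature corpus (assumption for illustration).
corpus_texts = [
    "covid-19 vaccine trials show an immune response in adults",
    "baseball season statistics and team scores for the year",
    "coronavirus transmission was studied in hospital settings",
]
question = "What did the vaccine trials show?"

# Stage 1: fit a small LDA topic model, then keep only documents whose
# dominant topic matches the question's dominant topic.
tokenized = [doc.split() for doc in corpus_texts]
dictionary = corpora.Dictionary(tokenized)
bows = [dictionary.doc2bow(doc) for doc in tokenized]
lda = models.LdaModel(bows, num_topics=2, id2word=dictionary, random_state=0)

def dominant_topic(text):
    """Return the highest-probability LDA topic id for a text."""
    bow = dictionary.doc2bow(text.split())
    return max(lda.get_document_topics(bow), key=lambda t: t[1])[0]

q_topic = dominant_topic(question)
candidates = [doc for doc in corpus_texts if dominant_topic(doc) == q_topic]

# Stage 2: run extractive BERT QA over the topically relevant articles.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)
for context in candidates:
    print(qa(question=question, context=context))
```

In a real deployment the topic filter would cut the number of expensive BERT inference calls by discarding off-topic articles up front, which is also where the paper's TPU-backed training and inference gains would matter most.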

Cite

APA

Venkataram, H. S., Mattmann, C. A., & Penberthy, S. (2020). TopiQAL: Topic-aware Question Answering using Scalable Domain-specific Supercomputers. In Proceedings of DLS 2020: Deep Learning on Supercomputers (held in conjunction with SC 2020: The International Conference for High Performance Computing, Networking, Storage and Analysis) (pp. 48–55). IEEE. https://doi.org/10.1109/DLS51937.2020.00011
