Decomposed Prompting to Answer Questions on a Course Discussion Board

Abstract

We propose and evaluate a question-answering system that uses decomposed prompting to classify and answer student questions on a course discussion board. Our system uses a large language model (LLM) to classify questions into one of four types: conceptual, homework, logistics, and not answerable. This enables us to employ a different strategy for answering questions that fall under different types. Using a variant of GPT-3, we achieve 81% classification accuracy. We discuss our system’s performance on answering conceptual questions from a machine learning course and various failure modes.
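The abstract describes a two-stage pipeline: an LLM first classifies each question into one of four types, and a type-specific strategy then produces the answer. A minimal, hypothetical sketch of that routing structure is below; the LLM is stubbed with a toy function, and all prompt wording, strategy bodies, and names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of a decomposed-prompting pipeline: classify, then
# dispatch to a type-specific answering strategy. The LLM is stubbed so
# the sketch runs offline; nothing here reproduces the paper's prompts.

QUESTION_TYPES = ["conceptual", "homework", "logistics", "not answerable"]

def classify_question(question: str, llm) -> str:
    """Stage 1: ask the LLM to label the question with one of four types."""
    prompt = (
        "Classify the student question into one of: "
        + ", ".join(QUESTION_TYPES) + ".\n"
        + f"Question: {question}\nType:"
    )
    label = llm(prompt).strip().lower()
    # Fall back conservatively if the model returns an unexpected label.
    return label if label in QUESTION_TYPES else "not answerable"

def answer_question(question: str, llm) -> str:
    """Stage 2: route to an answering strategy chosen by the predicted type."""
    qtype = classify_question(question, llm)
    strategies = {
        "conceptual": lambda q: llm(f"Explain the concept asked about: {q}"),
        "homework": lambda q: "Please see the assignment handout; staff will follow up.",
        "logistics": lambda q: "Check the course syllabus for logistics details.",
        "not answerable": lambda q: "This question needs instructor attention.",
    }
    return strategies[qtype](question)

# Toy stand-in for an LLM so the example is self-contained.
def fake_llm(prompt: str) -> str:
    if prompt.startswith("Classify"):
        return "logistics" if "deadline" in prompt else "conceptual"
    return "A generated explanation."

print(answer_question("When is the deadline for HW2?", fake_llm))
# prints "Check the course syllabus for logistics details."
```

Keeping classification and answering as separate prompts is what lets unanswerable or policy-sensitive questions (e.g. homework) be deflected before any answer generation happens.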

Citation (APA)

Jaipersaud, B., Zhang, P., Ba, J., Petersen, A., Zhang, L., & Zhang, M. R. (2023). Decomposed Prompting to Answer Questions on a Course Discussion Board. In Communications in Computer and Information Science (Vol. 1831 CCIS, pp. 218–223). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-36336-8_33
