We propose and evaluate a question-answering system that uses decomposed prompting to classify and answer student questions on a course discussion board. Our system uses a large language model (LLM) to classify questions into one of four types: conceptual, homework, logistics, and not answerable. This decomposition enables us to employ a different answering strategy for each question type. Using a variant of GPT-3, we achieve 81% classification accuracy. We discuss our system’s performance on answering conceptual questions from a machine learning course and various failure modes.
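The two-stage pipeline the abstract describes — classify first, then dispatch to a type-specific answering strategy — can be sketched as follows. The prompt wording, the generic `llm` callable, and the per-type handlers are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of decomposed prompting for a discussion-board QA system:
# stage 1 classifies the question, stage 2 answers it with a strategy
# chosen by type. The `llm` argument is any function mapping a prompt
# string to a completion string (a hypothetical stand-in for a GPT-3 call).

QUESTION_TYPES = ["conceptual", "homework", "logistics", "not answerable"]

CLASSIFY_PROMPT = (
    "Classify the following student question as one of: "
    "conceptual, homework, logistics, not answerable.\n"
    "Question: {question}\nType:"
)

def classify(question, llm):
    """Stage 1: ask the LLM for the question type."""
    reply = llm(CLASSIFY_PROMPT.format(question=question)).strip().lower()
    # Fall back to "not answerable" if the reply is not a known type.
    return reply if reply in QUESTION_TYPES else "not answerable"

def answer(question, llm):
    """Stage 2: dispatch to a type-specific answering strategy."""
    qtype = classify(question, llm)
    if qtype == "conceptual":
        return llm("Answer this course concept question: " + question)
    if qtype == "logistics":
        return llm("Answer using the course logistics information: " + question)
    if qtype == "homework":
        # Homework questions get hints rather than full solutions.
        return llm("Give a hint, not a solution, for: " + question)
    # Questions the system cannot answer are deferred to course staff.
    return "This question has been forwarded to a course instructor."
```

Separating classification from answering lets each stage use a prompt tailored to its narrower task, which is the core idea of decomposed prompting.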
Jaipersaud, B., Zhang, P., Ba, J., Petersen, A., Zhang, L., & Zhang, M. R. (2023). Decomposed Prompting to Answer Questions on a Course Discussion Board. In Communications in Computer and Information Science (Vol. 1831 CCIS, pp. 218–223). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-36336-8_33