The importance of feedback as an aid to self-assessment is widely acknowledged. A common form of feedback used widely in e-learning is the model answer. However, model answers are deficient in many respects. In particular, the notion of a model answer implies the existence of a single correct answer applicable across multiple contexts, with no scope for permissible variation. This reductive assumption rarely holds for complex problems that are supposed to test students' higher-order learning. Nevertheless, the challenge remains of how to support students as they assess their own performance using model answers and other forms of non-verificational feedback. To explore this challenge, the research examined a management-development e-learning application and investigated the effectiveness of model answers that followed problem-based questions. The research was exploratory, using semi-structured interviews with 29 adult learners employed in a global organisation. Given interviewees' generally negative perceptions of the model answers, they were asked to describe their ideal form of self-assessment materials and to evaluate nine alternative designs. The results suggest that, as support for higher-order learning, self-assessment materials that merely present an idealised model answer are inadequate. As alternatives, learners preferred materials that helped them understand what behaviours to avoid (and not just what to do), how to think through the problem (i.e. critical-thinking skills), and the key issues that provide a framework for thinking. These findings have broader relevance within higher education, particularly in postgraduate programmes for business students, where the importance of prior business experience is emphasised and the profile of students is similar to that of the participants in this research.
Handley, K., & Cox, B. (2007). Beyond model answers: learners’ perceptions of self-assessment materials in e-learning applications. ALT-J, 15(1), 21–36. https://doi.org/10.1080/09687760601129539