One size doesn’t fit all – effectiveness and subjective evaluations of adaptable information literacy instruction


Abstract

The paper examines whether the effects of an adaptable information literacy instruction program are associated with (a) adherence to recommendations for online learning content derived from a test of prior knowledge and (b) subjective evaluations of the program. An adaptable blended-learning training for German psychology students was evaluated in a study with a pretest-posttest design. N = 64 advanced students completed two tests of scholarly information literacy, an information literacy self-efficacy scale, and an evaluation questionnaire. Participants who worked on more online materials than recommended based on their pretest performance did not differ in their gain scores from participants who followed the recommendations exactly. However, both groups outperformed participants who omitted recommended materials. According to the subjective evaluations, the latter participants constitute a "risk group" with low acceptance of online teaching who might need additional support during online learning or alternative forms of instruction.

Citation (APA)
Mayer, A. K., Peter, J., Leichner, N., & Krampen, G. (2015). One size doesn’t fit all – effectiveness and subjective evaluations of adaptable information literacy instruction. In Communications in Computer and Information Science (Vol. 552, pp. 283–292). Springer Verlag. https://doi.org/10.1007/978-3-319-28197-1_29
