Background: Calls for the reform of education in science, technology, engineering, and mathematics (STEM) have inspired many instructional innovations, some research based. Yet adoption of such instruction has been slow. Research has suggested that students' response may significantly affect an instructor's willingness to adopt different types of instruction.

Purpose: We created the Student Response to Instructional Practices (StRIP) instrument to measure the effects of several variables on student response to instructional practices. We discuss the step-by-step process for creating this instrument.

Design/Method: The development process had six steps: item generation and construct development, validity testing, implementation, exploratory factor analysis, confirmatory factor analysis, and instrument modification and replication. We discuss pilot testing of the initial instrument, construct development, and validation using exploratory and confirmatory factor analyses.

Results: This process produced 47 items measuring three parts of our framework. Types of instruction separated into four factors (interactive, constructive, active, and passive); strategies for using in-class activities into two factors (explanation and facilitation); and student responses to instruction into five factors (value, positivity, participation, distraction, and evaluation).

Conclusions: We describe the design process and final results for our instrument, a useful tool for understanding the relationship between type of instruction and students' response.
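The exploratory factor analysis step described above can be illustrated with a minimal sketch. The data below are simulated for illustration only (the actual StRIP instrument has 47 items and was validated on real survey responses); the approach shown is a principal-axis style eigen-decomposition of the item correlation matrix with the Kaiser criterion (eigenvalue > 1) for factor retention, one common way such analyses are done.

```python
import numpy as np

# Simulated Likert-style responses: 200 respondents x 6 items driven by
# two latent factors (hypothetical data, not from the StRIP study).
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings_true = np.array([[0.8, 0.0],
                          [0.7, 0.1],
                          [0.9, 0.0],
                          [0.0, 0.8],
                          [0.1, 0.7],
                          [0.0, 0.9]])
X = latent @ loadings_true.T + 0.3 * rng.normal(size=(200, 6))

# Eigen-decompose the item correlation matrix and retain factors
# whose eigenvalue exceeds 1 (Kaiser criterion).
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))

# Unrotated loadings: eigenvectors scaled by sqrt(eigenvalue).
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(n_factors, loadings.shape)
```

With this simulated structure, the Kaiser criterion recovers the two underlying factors; in practice, a rotation (e.g., varimax) would be applied to the loadings before interpreting which items group together.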
DeMonbrun, M., Finelli, C. J., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an Instrument to Measure Student Response to Instructional Practices. Journal of Engineering Education, 106(2), 273–298. https://doi.org/10.1002/jee.20162