The evidence used when making decisions about the design, implementation, and evaluation of intervention programs should be methodologically sound. Depending on the intervention context, different methodologies may apply. Nonetheless, intervention contexts are often unstable, and adapting to changing circumstances sometimes requires modifying the original plan. The framework proposed here draws on approaches that can be regarded as two extremes of a continuum: experimental/quasi-experimental designs and studies based on observational methodology. Under unstable intervention conditions, this framework supports decisions about design, measurement, and analysis from a methodological quality perspective. The structural dimensions, i.e., units (participants, users), treatment (program activities), outcomes (results, including decisions about which instruments to use and how to gather data), setting (implementation context), and time, are detailed as part of the practical framework. The present study aims to specify the degree of correspondence/complementarity between the components of these structural dimensions of a program evaluation from a practical complementarity perspective based on methodological quality.
Citation
Chacón-Moscoso, S., Sanduvete-Chaves, S., Lozano-Lozano, J. A., Portell, M., & Anguera, M. T. (2021). From randomized control trial to mixed methods: A practical framework for program evaluation based on methodological quality. Anales de Psicología, 37(3), 599–608. https://doi.org/10.6018/analesps.470021