Missed opportunities in the evaluation of public health interventions: A case study of physical activity programmes


This article is free to access.

Abstract

Background: Evidence-based approaches are requisite in evaluating public health programmes. Nowhere are they more necessary than in physical activity interventions, where evidence of effectiveness is often poor, especially within hard-to-reach groups. Our study reports on the quality of the evaluation of a government-funded walking programme in five 'Walking Cities' in England. Cities were required to undertake a simple but robust evaluation using the Standard Evaluation Framework (SEF) for physical activity interventions to enable high-quality, consistent evaluation. Our aim was not to evaluate the outcomes of this programme but to assess whether the evaluation process had been effective in generating new and reliable evidence on intervention design and on what had worked in 'real world' circumstances.

Methods: Funding applications and final reports produced by the funder and the five walking cities were obtained. These totalled 16 documents, which were systematically analysed against the 52 criteria in the SEF. Data were cross-checked between the documents at the bid and reporting stages with reference to the SEF guidance notes.

Results: Generally, the SEF reporting requirements were not followed well. The rationale for the interventions was poorly described, the target population was not precisely specified, and neither was the method of recruitment. Demographics of individual participants, including socio-economic status, were poorly reported, despite being a key criterion for funding.

Conclusions: Our study of the evaluations demonstrated a missed opportunity to confidently establish what worked and what did not work in walking programmes with particular populations. This limited the potential for evidence synthesis and for highlighting innovative practice warranting further investigation. Our findings suggest a mandate for evaluability assessment; used at the planning stage, this may have ensured the development of realistic objectives and, crucially, may have identified innovative practice to implement and evaluate. Logic models may also have helped in the development of the intervention and its means of capturing evidence prior to implementation. Research-practice partnerships between universities and practitioners could further enhance this process. A lack of conceptual clarity means that replicating and scaling up effective interventions is difficult, and the opportunity to learn from failure is lost.

Citation (APA)

Hanson, S., & Jones, A. (2017). Missed opportunities in the evaluation of public health interventions: A case study of physical activity programmes. BMC Public Health, 17(1). https://doi.org/10.1186/s12889-017-4683-z
