Background: Health professionals sometimes do not use the best available evidence to treat their patients, partly because of unintentional omissions and information overload. Reminders help clinicians overcome these problems by prompting them to recall information they already know, or by presenting information in a different, more accessible format. Manually-generated reminders delivered on paper are defined as information given to the health professional with each patient or encounter, provided on paper, with no computer involved in producing or delivering the reminder. They are relatively cheap interventions and are especially relevant in settings where electronic clinical records are not widely available or affordable. This review is one of three Cochrane Reviews focused on the effectiveness of reminders in health care.

Objectives: 1. To determine the effectiveness of manually-generated reminders delivered on paper in changing professional practice and improving patient outcomes. 2. To explore whether a number of potential effect modifiers influence the effectiveness of manually-generated reminders delivered on paper.

Search methods: We searched CENTRAL, MEDLINE, Embase, CINAHL and two trials registers on 5 December 2018. We searched grey literature, screened individual journals, conference proceedings and relevant systematic reviews, and reviewed reference lists and cited references of included studies.

Selection criteria: We included randomised and non-randomised trials assessing the impact of manually-generated reminders delivered on paper, either as a single intervention (compared with usual care) or added to one or more co-interventions as a multicomponent intervention (compared with the co-intervention(s) without the reminder component), on professional practice or patient outcomes. We also included randomised and non-randomised trials comparing manually-generated reminders with other quality improvement (QI) interventions.

Data collection and analysis: Two review authors independently screened studies for eligibility and extracted data. We extracted the primary outcome as defined by the authors or calculated the median effect size across all reported outcomes in each study. We then calculated the median percentage improvement and interquartile range (IQR) across the included studies that reported improvement-related outcomes (illustrated in the sketch after the abstract), and assessed the certainty of the evidence using the GRADE approach.

Main results: We identified 63 studies (41 cluster-randomised trials, 18 individually randomised trials, and four non-randomised trials) that met all inclusion criteria. Fifty-seven studies reported usable data (64 comparisons). The studies were mainly located in North America (42 studies) and the UK (eight studies). Fifty-four studies took place in outpatient/ambulatory settings. The clinical areas most commonly targeted were cardiovascular disease management (11 studies), cancer screening (10 studies) and preventive care (10 studies), and most studies had physicians as their target population (57 studies). General management of a clinical condition (17 studies), test-ordering (14 studies) and prescription (10 studies) were the behaviours most commonly targeted by the intervention. Forty-eight studies reported changes in professional practice measured as dichotomous process adherence outcomes (e.g. compliance with guideline recommendations), 16 reported those changes measured as continuous process-of-care outcomes (e.g. number of days with catheters), eight reported dichotomous patient outcomes (e.g. mortality rates) and five reported continuous patient outcomes (e.g. mean systolic blood pressure). Manually-generated reminders delivered on paper probably improve professional practice measured as dichotomous process adherence outcomes compared with usual care (median improvement 8.45% (IQR 2.54% to 20.58%); 39 comparisons, 40,346 participants; moderate certainty of evidence) and may make little or no difference to continuous process-of-care outcomes (8 comparisons, 3263 participants; low certainty of evidence). Adding manually-generated paper reminders to one or more QI co-interventions may slightly improve professional practice measured as dichotomous process adherence outcomes (median improvement 4.24% (IQR −1.09% to 5.50%); 12 comparisons, 25,359 participants; low certainty of evidence) and probably slightly improves professional practice measured as continuous outcomes (median improvement 0.28 (IQR 0.04 to 0.51); 2 comparisons, 12,372 participants; moderate certainty of evidence). Compared with other QI interventions, manually-generated reminders may slightly decrease professional practice measured as process adherence outcomes (median decrease 7.9% (IQR −0.7% to 11%); 14 comparisons, 21,274 participants; low certainty of evidence). We are uncertain whether manually-generated reminders delivered on paper, compared with usual care or with other QI interventions, lead to better or worse patient outcomes (dichotomous or continuous), as the certainty of the evidence is very low (10 studies, 13 comparisons). Reminders added to other QI interventions may make little or no difference to patient outcomes (dichotomous or continuous) compared with the QI intervention(s) alone (2 studies, 2 comparisons). Regarding resource use, studies reported additional costs per additional point of effectiveness gained, but because the studies used different currencies and years, the relevance of those figures is uncertain. None of the included studies reported outcomes related to harms or adverse effects.

Authors' conclusions: Manually-generated reminders delivered on paper as a single intervention probably lead to small to moderate increases in adherence to clinical recommendations, and they could be used as a single QI intervention. It is uncertain whether reminders should be added to other QI interventions already in place in the health system, although the effects may be positive. If other QI interventions, such as patient or computerised reminders, are available, they should be preferred over manually-generated reminders, but under close evaluation to reduce uncertainty about their potential effect.
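The following is a minimal sketch, not taken from the review, of the two-step aggregation described under "Data collection and analysis": within each study, a single summary effect is obtained (the primary outcome as defined by the authors, or the median across all reported outcomes), and across studies the median improvement and IQR are reported. The data values and variable names are hypothetical and for illustration only.

```python
# Hedged sketch (hypothetical data) of the review's summary approach:
# per-study median effect, then median and IQR across studies.
import numpy as np

# Hypothetical per-study results: each inner list holds the percentage-point
# improvements for every improvement-related outcome reported by that study.
study_outcomes = [
    [5.0, 12.0, 3.5],   # study 1: several reported outcomes
    [20.0],             # study 2: single primary outcome as defined by authors
    [-1.0, 8.0],        # study 3
    [15.5, 9.0, 2.0],   # study 4
]

# Step 1: one summary effect per study (median across its reported outcomes).
per_study_effect = [np.median(outcomes) for outcomes in study_outcomes]

# Step 2: summarise across studies with the median and the interquartile range
# (25th to 75th percentiles).
median_improvement = np.median(per_study_effect)
q1, q3 = np.percentile(per_study_effect, [25, 75])

print(f"Median improvement: {median_improvement:.2f}% (IQR {q1:.2f}% to {q3:.2f}%)")
```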
Citation:
Pantoja, T., Grimshaw, J. M., Colomer, N., Castañon, C., & Leniz Martelli, J. (2019, December 18). Manually-generated reminders delivered on paper: effects on professional practice and patient outcomes. Cochrane Database of Systematic Reviews. John Wiley and Sons Ltd. https://doi.org/10.1002/14651858.CD001174.pub4