The QUEST for quality online health information: Validation of a short quantitative tool

Abstract

Background: Online health information is unregulated and can be of highly variable quality. There is currently no singular quantitative tool that has undergone a validation process, can be used for a broad range of health information, and strikes a balance between ease of use, concision and comprehensiveness. To address this gap, we developed the QUality Evaluation Scoring Tool (QUEST). Here we report on the analysis of the reliability and validity of the QUEST in assessing the quality of online health information.

Methods: The QUEST and three existing tools designed to measure the quality of online health information were applied to two randomized samples of articles containing information about the treatment (n = 16) and prevention (n = 29) of Alzheimer disease as a sample health condition. Inter-rater reliability was assessed using a weighted Cohen's kappa (κ) for each item of the QUEST. To compare the quality scores generated by each pair of tools, convergent validity was measured using Kendall's tau (τ) ranked correlation.

Results: The QUEST demonstrated high levels of inter-rater reliability for the seven quality items included in the tool (κ ranging from 0.7387 to 1.0, P
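For readers unfamiliar with the two statistics named in the abstract, the sketch below shows how a weighted Cohen's kappa (inter-rater reliability) and a Kendall's tau ranked correlation (convergent validity between two scoring tools) are typically computed with standard Python libraries. This is not the authors' analysis code, and the rating and score values are hypothetical placeholders.

```python
# Illustrative sketch only: hypothetical data, not the QUEST study dataset.
from sklearn.metrics import cohen_kappa_score
from scipy.stats import kendalltau

# Hypothetical ordinal ratings from two raters for one QUEST item across articles
rater_a = [2, 1, 0, 2, 1, 2, 0, 1]
rater_b = [2, 1, 1, 2, 1, 2, 0, 1]

# Weighted Cohen's kappa for inter-rater reliability (linear weighting assumed)
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")

# Hypothetical overall quality scores for the same articles from two tools
quest_scores = [18, 12, 9, 20, 11, 17, 8, 13]
other_tool_scores = [30, 22, 18, 33, 20, 29, 15, 24]

# Kendall's tau ranked correlation for convergent validity between the tools
tau, p_value = kendalltau(quest_scores, other_tool_scores)

print(f"Weighted kappa: {kappa:.3f}")
print(f"Kendall's tau: {tau:.3f} (p = {p_value:.3f})")
```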

Citation (APA)

Robillard, J. M., Jun, J. H., Lai, J. A., & Feng, T. L. (2018). The QUEST for quality online health information: Validation of a short quantitative tool. BMC Medical Informatics and Decision Making, 18(1). https://doi.org/10.1186/s12911-018-0668-9
