Assessing and documenting general competencies in otolaryngology resident training programs


Abstract

OBJECTIVES: The objectives of this study were to: 1) implement web-based instruments for assessing and documenting the general competencies of otolaryngology resident education, as outlined by the Accreditation Council for Graduate Medical Education (ACGME); and 2) examine the benefit and validity of this online system for measuring educational outcomes and for identifying insufficiencies in the training program as they occur.

METHODS: We developed an online assessment system for a surgical postgraduate education program and examined its feasibility, usability, and validity. Evaluations of the behaviors, skills, and attitudes of 26 residents were completed online by faculty, peers, and nonphysician professionals over a 3-year period. Analyses included calculation and evaluation of each resident's total average performance scores by the different evaluator types. Evaluations were also compared with each resident's scores on the in-service examination (ISE) administered by the American Board of Otolaryngology. Convergent validity was examined statistically by comparing ratings among the different evaluator types.

RESULTS: The questionnaires and software were simple to use and efficient in collecting essential information. From July 2002 to June 2005, 1,336 evaluation forms were available for analysis. The average score assigned by faculty (4.31) was significantly lower than those assigned by nonphysician professionals (4.66) and by residents evaluating peers (4.63) (P < .001), whereas scores were similar between the nonphysician and resident peer groups. Average scores from the faculty and nonphysician groups correlated on the constructs of communication and relationship with patients, but not on professionalism and documentation. Between the faculty and resident peer groups, scores correlated on respect for patients but not on medical knowledge. Resident ISE scores improved in the third year of the study and correlated strongly with faculty perceptions of medical knowledge (r = 0.65, P = .007).

CONCLUSIONS: Compliance with form completion was 97%. The system facilitated the educational management of our training program along multiple dimensions. The small perceptual differences among a highly selected group of residents made unambiguous validation of the system challenging. The instruments and approach warrant further study, and improvements are likely best achieved in broad consultation with other otolaryngology programs. © 2006 The American Laryngological, Rhinological and Otological Society, Inc.
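The abstract describes two statistical steps: comparing mean evaluation scores across evaluator groups and correlating faculty ratings of medical knowledge with ISE scores. The sketch below is a minimal, hypothetical illustration of that kind of analysis; the invented data values, the choice of a two-sample t-test, and the use of SciPy are assumptions for illustration and do not reproduce the study's actual dataset or statistical procedures.

```python
# Hypothetical illustration of the analyses described in the abstract:
# group mean comparison and Pearson correlation with ISE scores.
# All data below are invented for illustration only.
import numpy as np
from scipy import stats

# Invented example ratings (1-5 scale) by evaluator group
faculty_scores = np.array([4.2, 4.4, 4.1, 4.5, 4.3])
nonphysician_scores = np.array([4.7, 4.6, 4.65, 4.7, 4.6])

# Compare faculty vs. nonphysician mean scores (two-sample t-test is one
# plausible approach; the abstract does not specify the exact test)
t_stat, p_value = stats.ttest_ind(faculty_scores, nonphysician_scores)
print(f"Faculty mean {faculty_scores.mean():.2f} vs "
      f"nonphysician mean {nonphysician_scores.mean():.2f}, p = {p_value:.3f}")

# Correlate faculty perceptions of medical knowledge with ISE scores
faculty_medical_knowledge = np.array([4.0, 4.3, 4.6, 3.9, 4.5])  # invented
ise_scores = np.array([62, 70, 78, 60, 75])                      # invented
r, p = stats.pearsonr(faculty_medical_knowledge, ise_scores)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```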

Citation (APA)

Roark, R. M., Schaefer, S. D., Yu, G. P., Branovan, D. I., Peterson, S. J., & Lee, W. N. (2006). Assessing and documenting general competencies in otolaryngology resident training programs. Laryngoscope, 116(5), 682–695. https://doi.org/10.1097/01.mlg.0000205148.14269.09
