Does it make sense to use written instruments to assess communication skills? Systematic review on the concurrent and predictive value of written assessment for performance
Permanent link: https://hdl.handle.net/10037/32706
Date: 2022-12-23
Type: Journal article
Peer reviewed
Author
Kiessling, Claudia; Perron, Noelle Junod; van Nuland, Marc; Bujnowska-Fedak, Maria Magdalena; Essers, Geurt; Joakimsen, Ragnar Martin; Pype, Peter; Tsimtsiou, Zoi
Abstract
Methods - The search covered four databases for peer-reviewed studies containing both written and performance-based CS assessment. Eleven studies met the inclusion criteria.
Results - Included studies predominantly assessed undergraduate medical students. Studies reported mainly low to medium correlations between written and performance-based assessment results (Objective Structured Clinical Examinations or encounters with simulated patients), with correlation coefficients ranging from 0.13 to 0.53 (p < 0.05). Higher correlations were reported when specific CS, such as motivational interviewing, were assessed. Only a few studies reported sufficient reliability indicators for both assessment formats.
Conclusions - Written assessment scores seem to predict performance-based assessment results to a limited extent but cannot replace them entirely. Reporting the psychometric properties of assessment instruments is essential for interpreting future findings and may also affect their predictive validity for performance.
Practice implications - Within longitudinal CS assessment programs, triangulation of assessment methods, including written assessment, is recommended, taking possible limitations into consideration. Written assessments with feedback can help students and trainers elaborate on procedural knowledge, providing strong support for the acquisition and transfer of CS to different contexts.