Inter-rater agreement in the scoring of abstracts submitted to a primary care research conference





BMC Health Services Research 2002, 2:8

Received: 03 December 2001 | Accepted: 26 March 2002 | First Online: 26 March 2002 | DOI: 10.1186/1472-6963-2-8

Cite this article as: Montgomery, A.A., Graham, A., Evans, P.H. et al. BMC Health Serv Res (2002) 2:8. doi:10.1186/1472-6963-2-8

Abstract

Background

Checklists for peer review aim to guide referees when assessing the quality of papers, but little evidence exists on the extent to which referees agree when evaluating the same paper. The aim of this study was to investigate agreement on dimensions of a checklist between two referees when evaluating abstracts submitted for a primary care conference.

Methods

Anonymised abstracts were scored using a structured assessment comprising seven categories. Between one (poor) and four (excellent) marks were awarded for each category, giving a maximum possible score of 28 marks. Every abstract was assessed independently by two referees and agreement was measured using intraclass correlation coefficients. Mean total scores of abstracts accepted and rejected for the meeting were compared using an unpaired t test.
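The analysis combines a category-level agreement statistic (the intraclass correlation coefficient) with an unpaired t test on total scores. The paper does not specify the ICC model or publish code, so the following is a minimal illustrative sketch, assuming a one-way random-effects ICC(1,1) for two referees and using entirely invented scores; it is not the authors' analysis.

```python
# Illustrative sketch only (not the authors' code). Assumes a one-way
# random-effects ICC(1,1) and invented example data.
import numpy as np
from scipy import stats

def icc_oneway(scores: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for an (n_targets x k_raters) array."""
    n, k = scores.shape
    target_means = scores.mean(axis=1)
    grand_mean = scores.mean()
    # Between-target and within-target mean squares from a one-way ANOVA
    ms_between = k * np.sum((target_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((scores - target_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical category scores: each row is one abstract, columns are the two referees
design_scores = np.array([[3, 3], [2, 3], [4, 4], [1, 2], [3, 2]])
print("ICC for one category:", round(icc_oneway(design_scores), 2))

# Hypothetical total scores (maximum 28) for accepted vs rejected abstracts
accepted = np.array([18, 17, 19, 16, 17])
rejected = np.array([15, 14, 16, 13, 15])
t, p = stats.ttest_ind(accepted, rejected)  # unpaired t test
print(f"t = {t:.2f}, p = {p:.4f}")
```

The one-way model treats the referees as interchangeable raters; a two-way model would be appropriate if the same two referees scored every abstract and rater effects were themselves of interest.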

Results

Of 52 abstracts, agreement between reviewers was greater for the three components relating to study design (adjusted intraclass correlation coefficients 0.40 to 0.45) than for the four components relating to more subjective elements, such as the importance of the study and its likelihood of provoking discussion (0.01 to 0.25). The mean score for accepted abstracts was significantly greater than for those that were rejected (17.4 versus 14.6; 95% CI for difference 1.3 to 4.1; p = 0.0003).

Conclusions

The findings suggest that inclusion of subjective components in a review checklist may result in greater disagreement between reviewers. However, in terms of overall quality scores, abstracts accepted for the meeting were rated significantly higher than those that were rejected.

Electronic supplementary material

The online version of this article (doi:10.1186/1472-6963-2-8) contains supplementary material, which is available to authorized users.




Authors: Alan A Montgomery, Anna Graham, Philip H Evans, Tom Fahey

Source: https://link.springer.com/






