Concordância entre avaliadores na seleção de artigos em revisões sistemáticas [Agreement between raters in the selection of articles for systematic reviews] (SciELO Brasil)
Concordancia intra- e interevaluadores [Intra- and inter-rater agreement]
Measuring agreement of administrative data with chart data using prevalence unadjusted and adjusted kappa
A Formal Proof of a Paradox Associated with Cohen's Kappa
Interobserver and Intertest Agreement in Telemedicine Glaucoma Screening with Optic Disk Photos and Optical Coherence Tomography (JCM)
La –/d/ final en el atlas dialectal de Madrid (ADIM): un cambio en marcha [Word-final /d/ in the dialect atlas of Madrid (ADIM): a change in progress]
An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters (Symmetry)
Results of ICC and weighted kappa statistics for the three panels of... (table)
Análisis comparativo de tres enfoques para evaluar el acuerdo entre observadores [Comparative analysis of three approaches for rater agreement]
A Simplified Cohen's Kappa for Use in Binary Classification Data Annotation Tasks
Desarrollo y estudio piloto de un conjunto esencial de indicadores para los servicios de cirugía general [Development and pilot study of an essential set of indicators for general surgery services] (Cirugía Española)
Q-Coh: A tool to screen the methodological quality of cohort studies in systematic reviews and meta-analyses (International Journal of Clinical and Health Psychology)
Explaining the unsuitability of the kappa coefficient in the assessment and comparison of the accuracy of thematic maps obtained by image classification (Giles M. Foody, 2020)
More than Just the Kappa Coefficient: A Program to Fully Characterize Inter-Rater Reliability between Two Raters
[Reliability and validity of a generic job exposure matrix applied to a small business]
Procedimientos para detectar y medir el sesgo entre observadores [Procedures for detecting and measuring bias between observers]
On population-based measures of agreement for binary classifications
Confiabilidade interobservadores na classificação de pares formados no relacionamento probabilístico entre bases de dados do SISMAMA [Interobserver reliability in the classification of pairs formed by probabilistic linkage between SISMAMA databases] (SciELO Brasil)
Relationships of Cohen's Kappa, Sensitivity, and Specificity for Unbiased Annotations
Reproductibilitat intraobservador en la tesi doctoral "Anàlisi metodològic dels estudis de concordança interobservador i carac…" [Intraobserver reproducibility in the doctoral thesis "Methodological analysis of interobserver agreement studies and charac…"]
A New Interpretation of the Weighted Kappa Coefficients
Kappa statistic (CMAJ)
The kappa statistic in reliability studies: use, interpretation, and sample size requirements
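The common thread in the sources above is Cohen's kappa, which corrects observed agreement between two raters for the agreement expected by chance from their marginal label frequencies: kappa = (p_o − p_e) / (1 − p_e). As a minimal sketch of that standard formula (not an implementation from any of the listed papers; the function name is ours):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the chance agreement implied by each rater's marginals.
    """
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, non-empty ratings"
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal frequency of each label.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    if p_e == 1.0:
        # Degenerate case: both raters always use the same single label.
        return 1.0
    return (p_o - p_e) / (1 - p_e)
```

For example, two raters who agree on 3 of 4 binary screening decisions with marginals of 3:1 and 2:2 get p_o = 0.75 and p_e = 0.5, hence kappa = 0.5 — the "prevalence paradox" discussed in several of the entries above arises precisely because p_e depends on these marginals.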