The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability - Joseph L. Fleiss, Jacob Cohen, 1973
An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters - Symmetry (MDPI)
Intra and Interobserver Reliability and Agreement of Semiquantitative Vertebral Fracture Assessment on Chest Computed Tomography - PLOS ONE
Inter-observer agreement and reliability assessment for observational studies of clinical work - ScienceDirect
Cohen's kappa in SPSS Statistics: Procedure, output and interpretation of the output using a relevant example - Laerd Statistics
Summary measures of agreement and association between many raters' ordinal classifications - ScienceDirect
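The common thread in the results above is chance-corrected agreement. As a minimal illustrative sketch (my own, not taken from any of the listed articles), Cohen's kappa for two raters is computed as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement rate and p_e is the agreement expected by chance from each rater's marginal label frequencies:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels on the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies,
    # summed over categories.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # → 0.333
```

Here 4 of 6 items agree (p_o = 2/3) and both raters use "yes"/"no" equally often (p_e = 1/2), giving κ = 1/3. Weighted kappa and the ICC, as Fleiss and Cohen (1973) show, coincide under a quadratic weighting scheme.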