![Symmetry | Free Full-Text | An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters](https://www.mdpi.com/symmetry/symmetry-14-00262/article_deploy/html/images/symmetry-14-00262-g001.png)
The Equivalence of Weighted Kappa and the Intraclass Correlation Coefficient as Measures of Reliability - Joseph L. Fleiss, Jacob Cohen, 1973
![Sample size determination and power analysis for modified Cohen's kappa statistic | Semantic Scholar](https://d3i71xaburhd42.cloudfront.net/e51bec1b939791a897b3153430133087c0d30eb1/10-Table3-1.png)
![Testing the normal approximation and minimal sample size requirements of weighted kappa when the number of categories is large](https://conservancy.umn.edu/bitstream/handle/11299/100360/v05n1p101.pdf.jpg?sequence=6&isAllowed=y)
![Table 3 from Sample Size Requirements for Interval Estimation of the Kappa Statistic for Interobserver Agreement Studies with a Binary Outcome and Multiple Raters | Semantic Scholar](https://d3i71xaburhd42.cloudfront.net/dee23caac4abe57e1817e86949e38fe9bc1cefcf/10-Table3-1.png)
![Cohen's Kappa and Fleiss' Kappa — How to Measure the Agreement Between Raters | by Audhi Aprilliant | Medium](https://miro.medium.com/v2/resize:fit:1186/1*pTgitFR4T5yGBFXrd8K6GQ.png)
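The items above all concern chance-corrected agreement. As a minimal sketch (not taken from any of the linked sources), Cohen's kappa for two raters follows the standard definition κ = (p_o − p_e) / (1 − p_e), where p_o is observed agreement and p_e is chance agreement from the raters' marginal label frequencies:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance,
    computed from each rater's marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: sum over labels of the product of marginal proportions.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (p_o - p_e) / (1 - p_e)
```

For example, two raters who agree on 6 of 10 binary items, each using label 1 six times, have p_o = 0.6 and p_e = 0.52, giving κ = 0.08 / 0.48 ≈ 0.167 — far lower than the raw 60% agreement suggests. Library implementations such as `sklearn.metrics.cohen_kappa_score` compute the same quantity.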