
Kappa statistic considerations in evaluating inter-rater reliability between two raters: Which, when and context matters

Shared by: _ _ | Date: | File type: PDF | Pages: 5


In research designs that rely on observational ratings provided by two raters, assessing inter-rater reliability (IRR) is a frequently required task. However, some studies fall short: they misapply statistical procedures, omit information essential for interpreting their findings, or inadequately address the impact of IRR on the statistical power of subsequent hypothesis tests.
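As an illustration of the kappa statistic discussed in this paper, the following is a minimal sketch of Cohen's kappa for two raters over the same set of items; the function name `cohens_kappa` and the example labels are ours, not the paper's.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning nominal categories to the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement expected from each rater's marginal frequencies.
    (Illustrative sketch, not the paper's own code.)
    """
    assert len(rater1) == len(rater2) and rater1, "need equal, non-empty ratings"
    n = len(rater1)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement from the raters' marginal category counts.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[cat] * c2[cat] for cat in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Example: 4 items, raters disagree on one of them.
kappa = cohens_kappa(["yes", "yes", "no", "no"], ["yes", "no", "no", "no"])
```

In this example the observed agreement is 0.75 but chance agreement is 0.50, so kappa is 0.50, which is why kappa is preferred over raw percent agreement when evaluating IRR.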

