#StackBounty: #agreement-statistics #cohens-kappa Inter-rater agreement after rater 1-based sampling

Bounty: 50

Rater 1 rated a set of cases as positive (+) or negative (−). Rater 2 then rates all of Rater 1's + cases, but only half of Rater 1's − cases. Is there a way to account for this sampling scheme when calculating inter-rater agreement (e.g., Cohen's kappa)?
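
One way to make the setup concrete: below is a minimal sketch of an inverse-probability-weighted kappa, under the assumptions that ratings are binary, that the half of Rater 1's − cases shown to Rater 2 was chosen at random, and that each sampled − case can therefore stand in for two (weight 2) while + cases keep weight 1. The function name `weighted_cohens_kappa` and the toy data are hypothetical illustrations, not part of the original question.

```python
import numpy as np

def weighted_cohens_kappa(r1, r2, weights):
    """Cohen's kappa computed from a 2x2 table in which each case
    contributes its inverse-inclusion-probability weight instead of 1."""
    r1 = np.asarray(r1)
    r2 = np.asarray(r2)
    weights = np.asarray(weights, dtype=float)

    # Weighted 2x2 contingency table; labels are coded 0 (−) and 1 (+).
    table = np.zeros((2, 2))
    for a, b, w in zip(r1, r2, weights):
        table[a, b] += w

    n = table.sum()
    p_o = np.trace(table) / n                   # observed agreement
    p_e = table.sum(1) @ table.sum(0) / n**2    # chance agreement from marginals
    return (p_o - p_e) / (1 - p_e)

# Hypothetical data: Rater 1's labels drove the sampling, so every
# sampled − case (weight 2) represents one unsampled − case as well.
r1 = [1, 1, 1, 1, 0, 0, 0]   # Rater 1 on the cases Rater 2 actually saw
r2 = [1, 1, 0, 1, 0, 1, 0]   # Rater 2 on those same cases
w  = [1, 1, 1, 1, 2, 2, 2]

print(weighted_cohens_kappa(r1, r2, w))  # weighted point estimate
```

This only reweights the point estimate; a standard error or confidence interval would also have to reflect the two-phase design, and since kappa itself depends on prevalence, whether weighting fully "fixes" it is part of what the question is asking.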

