Coding comparison and kappa coefficient


Knukes    8

Hi,

 

I'm comparing inter-rater reliability between me and my colleague in the coding of a transcript. I ran a comparison query at all nodes, based on sentence calculations, but the kappa coefficient is negative for almost all nodes, even ones which we have coded in exactly the same way. For example, at one node where I can see manually that we have each coded the source the same way, the percentage agreement is 96% and the percentage disagreement is 4%, giving a kappa of -0.02. Furthermore, when I check the "show coding comparison content" box, none of the coded content is green, even though we coded at that node with 100% agreement and 0% disagreement.

 

How do I make sense of this comparison?

Thanks.

QSRSupport    76

Hello Knukes,

 

Coding comparison takes into account not only the content that was coded by both users, but also the content that was coded by only one user or by neither user.

 

For example, if the source is a document with 1000 characters, where:

  • 50 of these characters have been coded by both users
  • 150 of these characters have been coded by only one of these users, and
  • the remaining 800 characters have not been coded by either user

then the percentage agreement is calculated as (800 + 50) ÷ 1000 = 85%.

So, although you can see that there is content which both users have coded in the same way, there will also be content which has been coded by only one user, or by neither user. This changes the agreement percentage, as demonstrated in the example above.
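
To put the same calculation in code, here is a minimal Python sketch using the character counts from the example above (the variable names are purely illustrative):

    # Character counts from the example above (a 1000-character source)
    coded_by_both = 50
    coded_by_one_user = 150
    coded_by_neither = 800

    total = coded_by_both + coded_by_one_user + coded_by_neither  # 1000

    # The users "agree" on a character when both coded it,
    # or when neither coded it.
    agreement = (coded_by_both + coded_by_neither) / total
    print(f"{agreement:.0%}")  # prints 85%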

Regarding the coding comparison content, please check the values in the table, especially the A and B (%) column. The coded content is shown in green only if there is a positive value in this column. So although you may have 96% in the Agreement column, the A and B (%) value could still be zero.

 

The values of the columns are calculated as follows:

  • Agreement column = (A and B) + (Not A and Not B)

  • A and B = the percentage of data item content coded to the selected node by both Project User Group A and Project User Group B

  • Not A and Not B = the percentage of data item content coded by neither Project User Group A nor Project User Group B
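
As for why the kappa itself comes out negative: NVivo's kappa coefficient is based on Cohen's kappa, which corrects the observed agreement for the agreement that would be expected by chance. When most of a source is uncoded by both users, the expected chance agreement is very high, so even 96% observed agreement can fall below it. The following rough Python sketch shows this; the character counts are hypothetical, chosen only so that the output matches the figures you describe (96% agreement, 0% in the A and B column, kappa of about -0.02):

    # Hypothetical character counts for one node in a 1000-character
    # source (illustrative only)
    both = 0        # characters coded by both user A and user B
    only_a = 20     # characters coded by user A only
    only_b = 20     # characters coded by user B only
    neither = 960   # characters coded by neither user

    total = both + only_a + only_b + neither  # 1000

    # Column values, as fractions of the source
    a_and_b = both / total             # "A and B" column      -> 0%
    not_a_not_b = neither / total      # "Not A and Not B"     -> 96%
    agreement = a_and_b + not_a_not_b  # "Agreement" column    -> 96%

    # Chance agreement, based on each user's overall coding rate
    p_a = (both + only_a) / total      # fraction coded by A: 2%
    p_b = (both + only_b) / total      # fraction coded by B: 2%
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)  # about 0.9608

    kappa = (agreement - expected) / (1 - expected)
    print(f"agreement={agreement:.0%}, kappa={kappa:.2f}")
    # prints: agreement=96%, kappa=-0.02

Note that A and B is 0% in this sketch, which is also why no content would be shown in green despite the high Agreement value.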

Please refer to the following link for detailed information: http://help-nv10.qsrinternational.com/desktop/procedures/run_a_coding_comparison_query.htm#MiniTOCBookMark9

Kind Regards,

Sameer S

