
Hello

 

I'm stuck (yet again) on using an NVivo function. I set up an intercoder test and had 2 coders code 48 images (sources) against 27 codes (nodes), and now I'm trying to deal with the results of the coding comparison, which NVivo only calculates for every source against every node, resulting in a spreadsheet with 1,296 (48 x 27) rows of comparison.

 

Exporting the results into Excel to try to calculate averages also gets me nowhere, as it keeps returning:

 

"ErrorEvaluation of function AVERAGE caused a divide by zero error."

 

I can see formulas like [=IF($Q30-$O30=0,1,($P30-$O30)/($Q30-O30))] in the NV example: http://redirect.qsrinternational.com/examples-coding-comparison-nv10-en.htm. However, since I don't use Excel very often and am not terribly proficient with formulas, I'm not sure how to set up and apply a formula to my spreadsheet columns to get meaningful overall Kappa and % agreement figures. All sources are to be weighted equally.

 

Another issue: when looking at the node reference counts I also can't see which coder applied which codes to which images. In node view it only shows who last coded (modified) what. I assigned colours to each coder and they also do not show up.

 

Any help would really be appreciated; I'm terribly behind in my work ...

 

Thanks!

Hi Scott,


The "Error: Evaluation of function AVERAGE caused a divide by zero error" message is usually the result of the formula referencing the wrong data, so you will need to modify the formulas to suit your own data. As an example, in the example sheet the unweighted Average for the node "Community\Community change" uses data from cells in rows 3-5. If your data sits in different rows you will need to adjust the formulas to reflect this, which is probably why you are getting the div/0 error.
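If adjusting the example formulas proves fiddly, another option (nothing NVivo-specific, just a generic sketch) is to compute the unweighted averages outside Excel. The snippet below assumes the exported comparison results have been saved as a CSV file called coding_comparison.csv and that the relevant columns are headed "Kappa" and "Agreement (%)"; those names are guesses, so check your actual file and headers first.

import csv

# Hypothetical file name and column headers - rename to match your export.
with open("coding_comparison.csv", newline="") as f:
    rows = list(csv.DictReader(f))

kappas = [float(row["Kappa"]) for row in rows]
agreements = [float(row["Agreement (%)"]) for row in rows]

# Unweighted averages: every source/node pair counts equally.
print("Rows compared:", len(kappas))
print("Average Kappa:", sum(kappas) / len(kappas))
print("Average agreement (%):", sum(agreements) / len(agreements))

This simple mean of each column, with every row weighted equally, is what the spreadsheet's AVERAGE formulas produce once they reference the correct rows.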



I hope this helps.



Cheers



Simon


Sorry, but I'm still not getting this. I attempted to average only a single column (Kappa) to see the average number and it turns up that error. What formula(s) are used to calculate the Kappa and percentage-agreement aggregates?

 

I mean, if I cut that column of Kappa values and paste it here: http://www.calculatorsoup.com/calculators/statistics/average.php it gives me:

 

Average (Mean):

Count: 1296 Sum: 1256 Average: 1256 / 1296 = 0.96913580246914

 

Would this be a correct calculation of the average Kappa between my coders? And since NVivo compares all sources against all nodes, is this still an accurate evaluation of inter-coder reliability?

 

Thanks again.


OK, having gotten no responses, and after trying for hours to get a side-by-side comparison of the coders' work with no luck, I found a Kappa calculator here: https://www.niwa.co.nz/node/104318/kapparesults

 

 

To get the numbers needed as inputs for a Kappa calculation, I took the following values from the NV10 coding comparison results:

 

Present-Present: Took the total number of A+B Agreement = 49
Present-Absent: Took the total # of A's but NOT B's = 31 (Coder 1)
Absent-Present: Took the total # of B's but NOT A's = 9 (Coder 2)
Absent-Absent: Took the total # of Neither A nor B = 1207
And here are the results from the calculator:
Kappa Results - 2x2 Interrater table

                 Rater B
Rater A          present   absent
present          49        9
absent           31        1207

estimated kappa = 0.6942
s.e.(0) = 0.0273, s.e.(estimated kappa) = 0.0455

Hypothesis test p-values: one-sided test, H0 is kappa ≤ 0.5
p = Prob[> estimated kappa, given that kappa = 0.5]
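(For reference, this appears to be the standard Cohen's Kappa calculation for a 2x2 table: observed agreement Po = (49 + 1207) / 1296 ≈ 0.9691; expected chance agreement Pe = ((49+31) × (49+9) + (9+1207) × (31+1207)) / 1296^2 ≈ 0.8990; Kappa = (0.9691 - 0.8990) / (1 - 0.8990) ≈ 0.694.)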

 

 

Could someone please confirm if this is accurate?

 

Thanks!


Hi Scott,

 

Firstly, here's how to visually compare the coding from different users in your project. Once you have run a Coding Comparison Query, you can right-click on any single row in the query results, and select 'Open Source'. NVivo will open the image and show coding stripes for the node represented in that row, including coding stripes for each user in your comparison query. I've attached an image that shows an example of these stripes. In NVivo, if you click on any one of the stripes, the associated region of the image will be highlighted.

Secondly, from your reported query results your Kappa result of 0.6942 looks correct, although our approach to calculating it would have been different.

 

Following the example in the spreadsheet located at http://redirect.qsrinternational.com/examples-coding-comparison-nv10-en.htm, I would start with the average of each of the Agreement and Disagreement detail columns, rather than the totals.

Deriving the averages from your totals (each total divided by the 1,296 source/node comparisons and expressed as a percentage), I get:

 

Agreement value                       Total    Average %
Present-Present (A+B agreement)          49     3.780864
Present-Absent (A but not B)             31     2.391975
Absent-Present (B but not A)              9     0.694444
Absent-Absent (neither A nor B)        1207    93.13272

However, these two approaches are equivalent in their outcome: Kappa depends only on the relative proportions in the table, so rescaling all four values by the same factor (here, converting totals to percentages) does not change it. You can calculate the Kappa from either set of values and achieve the same score.

I verified this by:

  1. Putting my calculated averages into the Kappa calculator you specified at https://www.niwa.co.nz/node/104318/kappa, instead of the totals. I confirm that I get the same Kappa score.
  2. Calculating Kappa independently, step by step, using Cohen's Kappa formula as detailed at https://en.wikipedia.org/wiki/Cohen%27s_kappa. I did this twice - once with your totals and once with my averages - and on both occasions I got the same Kappa score (a small sketch of that calculation follows this list).
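For anyone who wants to reproduce that check, here is a minimal sketch of the step-by-step calculation in plain Python (this is not an NVivo feature; the four values are simply the totals and averages quoted above):

def cohens_kappa(both, a_only, b_only, neither):
    # Cohen's Kappa for a 2x2 agreement table between two coders.
    total = both + a_only + b_only + neither
    # Observed agreement: the two coders made the same decision.
    po = (both + neither) / total
    # Expected chance agreement, from each coder's marginal proportions.
    pe = ((both + a_only) * (both + b_only)
          + (b_only + neither) * (a_only + neither)) / total ** 2
    return (po - pe) / (1 - pe)

# From the totals ...
print(cohens_kappa(49, 31, 9, 1207))                         # about 0.694
# ... and from the same values expressed as percentage averages.
print(cohens_kappa(3.780864, 2.391975, 0.694444, 93.13272))  # about 0.694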

This score of 0.6942 represents the Kappa coefficient averaged across all nodes and sources in your original query, where each source is considered to have the same weight.

Regards,

Kate

[Attached image: image with user coding stripes.png]

