Coding in a large Excel spreadsheet is prohibitively slow. Alternatives?



Hi everyone,

Our project involves qualitative coding of six open-ended questions at the end of a long survey of quantitative fields. The spreadsheet has more than 200 columns of quantitative data that we want to keep as attributes (e.g. demographic info) and to cross-tabulate against later in matrices. The dataset, imported from an Excel spreadsheet, has hundreds of observations.

Coding the open-response fields in this large spreadsheet is prohibitively laggy, and we're looking for better solutions. Would it be possible to keep the quantitative attributes in a separate imported file from the coded fields and merge the two sources by a linking ID number after the coding is done? Alternatively, I know it's possible to import each observation as its own source and use the spreadsheet as a classification table that isn't coded, but would each of the hundreds of sources/observations then have to be manually linked to its row in the classification table? If so, that also sounds too time-intensive to be feasible.
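In case it helps frame the question, here is a minimal sketch of the split we have in mind for the first option, assuming Python with pandas (and openpyxl for .xlsx files). The file name, the ID column, and the open-ended question headers are hypothetical placeholders for the real ones:

import pandas as pd

# Hypothetical column names -- substitute the survey's real linking ID
# and the actual headers of the six open-ended questions.
ID_COL = "RespondentID"
OPEN_ENDED_COLS = ["Q1_open", "Q2_open", "Q3_open", "Q4_open", "Q5_open", "Q6_open"]

# Load the full survey export (the linking ID, the ~200 quantitative
# columns, and the six open-ended responses).
df = pd.read_excel("survey_export.xlsx")

# File 1: the linking ID plus only the open-ended questions -- the small,
# codable dataset.
df[[ID_COL] + OPEN_ENDED_COLS].to_excel("open_ended_responses.xlsx", index=False)

# File 2: the linking ID plus all quantitative/attribute columns, kept
# aside to merge back in by ID after coding.
attribute_cols = [c for c in df.columns if c not in OPEN_ENDED_COLS]
df[attribute_cols].to_excel("survey_attributes.xlsx", index=False)

The idea is that open_ended_responses.xlsx would be small enough to code comfortably, and survey_attributes.xlsx would be merged back in by the linking ID afterwards.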

Are there any workarounds that you know of? What would your recommendations be?

Thanks in advance!


Hi,

Yes, it is possible to keep the codable data in a separate file and link the attributes later via a classification sheet. You can also import individual sources and use a classification sheet to link the attributes to them automatically after you have coded the imported sources.

Please refer to the detailed information about classification sheets at the help link below.

http://help-nv11.qsrinternational.com/desktop/procedures/import_(or_export)_classification_sheets.htm
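As a rough sketch only (the file and column names are hypothetical, and the exact header layout NVivo expects is described at the link above), the attribute file could be reshaped into a classification sheet with a small script, assuming each imported source is named after its linking ID:

import pandas as pd

# Hypothetical file and column names; check the help link above for the
# exact classification sheet format NVivo expects.
ID_COL = "RespondentID"

attrs = pd.read_excel("survey_attributes.xlsx")

# A classification sheet links rows to sources/cases by name, so the first
# column should hold the same names used for the imported sources -- here
# we assume each source was named after its linking ID.
attrs.insert(0, "Source name", attrs[ID_COL].astype(str))

attrs.drop(columns=[ID_COL]).to_excel("classification_sheet.xlsx", index=False)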

Kind Regards,

Bhupesh


Archived

This topic is now archived and is closed to further replies.
