
About Heather_McPhail

  • Rank
    Casual Member

Profile Information

  • Research Interests
    Public health and social media
  • Organization
    Western University
  • Job Title
    Research Associate
  1. @QSRSupport Thanks for your reply. Would there be any way to get a list of the row ids that have a certain code other than doing it manually?
  2. @canarik @QSRSupport Thanks for getting back to me... though I wish there was a way...
  3. I have already coded approximately 6000 records and am now realizing that I should have made another field codable. Is there any way to turn a classifying field into a codable field after importing? It seems like something that should be straightforward. I did see an older question that was similar and the answer was 'no' but I'm hoping that in 2021 this functionality exists. Thanks for any help you can offer, Heather
  4. Hi @canarik, thank you for your prompt reply! I tried running a coding query, but the dataset tab there only showed two of my columns (the two that actually had parts coded to the node). Is there a way to make sure all of the columns are included?
  5. I'm having this problem as well. I'm wondering if 5 years later this functionality exists?
  6. Hi, I'm very new to using Nvivo and I'm struggling to figure out how to get my coded data out so I can feed it into another program. What I would like to do is export my dataset file with all the columns, but only include the rows that have been coded to a certain node. Is this possible? Another option that would work well for me would be to export my dataset with all its columns, but add a column for each of my nodes. Is that possible? I saw a similar question from 2017 and at that time it wasn't possible but I'm wondering if it is possible now. I have tried using the expo
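The task described in the last post — export the full dataset but keep only the rows coded to a certain node, or add a column per node — can also be handled outside NVivo once the data is exported. The sketch below is a hedged illustration in pandas, assuming you have a CSV export of the full dataset and a separate export (e.g. from a coding query) listing the row ids coded to the node; the file names, the `RowID` column, and the node export format are all assumptions, not NVivo specifics.

```python
import pandas as pd

# In practice these would come from NVivo exports, e.g.:
#   dataset = pd.read_csv("survey_dataset.csv")       # full dataset, all columns
#   coded   = pd.read_csv("coding_query_export.csv")  # rows a coding query returned
# Small inline stand-ins so the sketch runs on its own:
dataset = pd.DataFrame({
    "RowID": [1, 2, 3, 4],
    "Response": ["a", "b", "c", "d"],
    "Region": ["N", "S", "N", "S"],
})
coded = pd.DataFrame({"RowID": [2, 4]})  # ids coded to the node of interest

# Option 1: keep every column, but only the rows coded to the node
coded_rows = dataset[dataset["RowID"].isin(coded["RowID"])]

# Option 2: keep all rows and add a yes/no column for the node instead
dataset["CodedToNode"] = dataset["RowID"].isin(coded["RowID"])
```

With one id list per node, repeating option 2 for each list gives the "one column per node" layout asked about above, ready to feed into another program.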