Top 10 qualitative data analysis methods

Published: Jan. 14, 2026

Key takeaways

Qualitative data analysis interprets meaning and patterns in non-numerical data like interviews, documents, and observations. Different methods serve different goals, and the right approach depends on the research question, data type, and analytic aims. Methods can be combined if the process is clearly documented. Qualitative data analysis software (QDA software) such as NVivo and ATLAS.ti supports organization, coding, comparison, and transparency but does not replace analytic decision-making by the researcher.

Qualitative data analysis refers to a set of methods used in qualitative research to interpret non-numerical data or textual data such as interview transcripts, open-ended survey responses, field notes, documents, and multimedia materials. The qualitative research process involves focusing on meaning, context, and patterns in language, interaction, and experience rather than measurement or statistical testing. Researchers use qualitative analysis across disciplines including education, health, social sciences, market research, and policy studies.

There is no single approach that fits all qualitative projects. Each method is shaped by different assumptions about data, research questions, and analytical goals. Some approaches emphasize categorizing content, while others focus on interpretation, theory development, or the structure of language and narratives.

This article outlines ten commonly used qualitative data analysis techniques, explains how they differ, and highlights situations where each approach is typically applied.

What is qualitative data analysis?

The qualitative data analysis process involves systematically examining non-numerical data to identify patterns, categories, relationships, and interpretations. The raw data typically come from interviews, focus groups, observations, documents, audio recordings, images, or open-ended survey responses. Unlike in quantitative analysis, qualitative analysis focuses on how people describe experiences, construct meaning, and use language in specific contexts.

Interpreting qualitative data usually involves several overlapping activities. Qualitative researchers read or view the data multiple times, assign labels or codes to relevant segments, and group those codes into broader categories or themes. Depending on the method, the analysis may also involve comparing cases, tracing sequences of events, examining how ideas change over time, or analyzing how language functions in interaction. Interpretation is guided by the research questions, theoretical perspective, and analytic framework chosen for the study.

Any qualitative data analysis approach can be inductive, where categories emerge from the data; deductive, where existing concepts guide coding; or a combination of both. The outcome is a structured account of patterns and explanations grounded in the data.

Looking to turn data into discovery? Discover Lumivero’s qualitative data analysis software, NVivo and ATLAS.ti, which help you advance knowledge with confidence.

Learn more

How to choose the right analysis method

The choice of qualitative research method depends on the research question, the type of data collected, and the purpose of the study. Methods differ in how they treat data, what they prioritize during analysis, and the kind of outcomes they produce. A clear understanding of what the study aims to explain or interpret should guide this decision.

Some methods are better suited for describing patterns across large datasets, while others focus on detailed interpretation of a small number of cases. For example, content analysis and thematic analysis work well when the goal is to identify recurring ideas across interviews or documents. Narrative and discourse analysis are more appropriate when the focus is on how stories are constructed or how language is used in specific social contexts. Grounded theory is typically used when the goal is to develop theory directly from the data rather than apply an existing framework.

Practical considerations also matter. Time constraints, team size, available expertise, and familiarity with specific methods can influence the choice. The nature of the data, such as whether it is highly structured, conversational, or longitudinal, should also be considered. Selecting a method that aligns with both the research goals and the data helps ensure analytic coherence and transparency.

10 common qualitative data analysis methods are:

  • Content analysis
  • Thematic analysis
  • Narrative analysis
  • Discourse analysis
  • Grounded theory
  • Phenomenological analysis
  • Case study analysis
  • Framework analysis
  • Sentiment analysis
  • Rhizome analysis

Content analysis

Content analysis is used when the goal is to systematically categorize and summarize large volumes of textual or visual data. This method works well when researchers want to draw research findings from patterns, frequencies, or changes in content across cases or over time.

Content analysis is commonly applied in:

  • Studies that examine documents
  • Interview transcripts
  • Open-ended survey responses
  • Media content
  • Policy texts

The qualitative analysis process typically begins by defining categories or coding rules. These categories may be developed inductively from the data or deductively based on prior research or theory. Data segments are then coded according to these categories, allowing researchers to compare how often certain ideas, terms, or topics appear and how they are distributed across sources. Content analysis can be qualitative, focusing on meaning and context, or more structured, emphasizing counts and comparisons.
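The coding-and-counting core of a structured content analysis can be sketched in a few lines of Python. This is only an illustration of the logic, not how NVivo or ATLAS.ti work internally; the categories and keyword rules are invented for the example, and in practice codes are applied by a human coder rather than by keyword matching.

```python
from collections import Counter

# Hypothetical coding scheme: category -> keywords that trigger the code.
# Real coding rules are richer and are applied with human judgment.
CODING_RULES = {
    "access_barriers": ["cost", "distance", "waiting"],
    "support": ["helpful", "supportive", "listened"],
}

def code_segment(segment: str) -> set[str]:
    """Return every category whose keywords appear in the segment."""
    text = segment.lower()
    return {cat for cat, kws in CODING_RULES.items()
            if any(kw in text for kw in kws)}

def tally(segments: list[str]) -> Counter:
    """Count how often each category is applied across all segments."""
    counts = Counter()
    for seg in segments:
        counts.update(code_segment(seg))
    return counts

segments = [
    "The cost was too high and the waiting time was long.",
    "Staff were supportive and listened to my concerns.",
    "Distance to the clinic is a real problem.",
]
print(tally(segments))  # access_barriers applied twice, support once
```

Piloting the rules on a small subset, as suggested above, amounts to running this pass on a few segments and checking whether any segment picks up the wrong category or none at all.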

A common example of content analysis is analyzing interview responses to identify recurring concerns among participants, such as barriers to healthcare access or perceptions of institutional support. It is also frequently used to examine trends in policy documents or news coverage.

Using software such as NVivo or ATLAS.ti can help manage large datasets, apply consistent codes, and retrieve coded segments efficiently. A practical tip is to pilot the coding scheme on a small subset of data and revise categories before coding the full dataset to reduce ambiguity and overlap.

Explore when to choose inductive content analysis and how to apply it step by step in our on-demand webinar, “Inductive Content Analysis for Qualitative Researchers.”

Watch now

Thematic analysis

Thematic analysis is used when the aim is to identify and interpret recurring patterns of meaning across a dataset. It is flexible and can be applied across a wide range of research questions, theoretical perspectives, and data types, including interviews, focus groups, reflective journals, and open-ended survey responses. This method is often chosen for studies that seek to describe shared experiences or perspectives. Unlike content analysis, which emphasizes systematically counting and categorizing predefined codes, thematic analysis focuses on developing and interpreting themes that capture how meaning is constructed across the dataset.

Thematic analysis typically involves:

  • Familiarization with the data
  • Generating initial codes
  • Grouping related codes
  • Defining broader themes

Themes represent patterns that are meaningful in relation to the research questions, not simply topics that appear frequently. The process is iterative, with researchers moving back and forth between data, codes, and themes to check consistency and relevance.
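As a toy illustration of the code-to-theme step, the grouping can be represented as plain data structures. The codes, themes, and segment numbers below are invented; the point is that a theme simply gathers the segments carrying any of its constituent codes, and the grouping can be revised as the analysis iterates.

```python
# Hypothetical codes applied to interview segments (code -> segment ids).
coded = {
    "workload": [1, 4, 7],
    "admin_tasks": [2, 4],
    "peer_support": [3, 5],
    "mentoring": [5, 6],
}

# Candidate themes group related codes; definitions are revised iteratively.
themes = {
    "time_pressure": ["workload", "admin_tasks"],
    "collegial_support": ["peer_support", "mentoring"],
}

def segments_for_theme(theme: str) -> set[int]:
    """Collect every segment carrying any code grouped under the theme."""
    return {seg for code in themes[theme] for seg in coded[code]}

print(sorted(segments_for_theme("time_pressure")))  # -> [1, 2, 4, 7]
```

Merging or splitting a theme is then just an edit to the `themes` mapping, which mirrors how QDA software lets researchers reorganize codes without re-coding the data.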

Practical examples of thematic analysis include analyzing qualitative data from interviews with teachers to identify themes related to classroom challenges, instructional strategies, or professional development needs. In health research, thematic analysis is often used to examine patient experiences or service use.

QDA software such as NVivo or ATLAS.ti supports thematic analysis by allowing researchers to code data segments, merge or split codes, and visually map relationships between themes. A useful tip is to document decisions about theme definitions and revisions using memos, which helps maintain transparency and supports later reporting.

Learn how to identify meaningful patterns in qualitative data using thematic analysis in “The Essential Guide to Thematic Analysis.”

Download now

Narrative analysis

Narrative analysis is used when the focus is on how individuals construct and communicate stories about their experiences. This method prioritizes sequence, structure, and perspective rather than breaking data into isolated codes.

Narrative analysis is particularly suitable for studies examining:

  • Life histories
  • Identity development
  • Decision-making processes
  • Events unfolding over time

Analysis involves examining how stories are organized, what events are emphasized or omitted, and how participants position themselves and others within their accounts. Researchers may focus on plot structure, turning points, or the use of language to assign meaning to experiences. Unlike methods that fragment data, narrative analysis often works with longer excerpts or complete accounts.

A common application of narrative analysis is analyzing interviews with patients describing illness trajectories or educators reflecting on career pathways. Narrative analysis is also used in longitudinal studies to examine how accounts change across multiple interviews.

Lumivero’s qualitative analysis software can support narrative work by linking full transcripts, annotating extended segments, and attaching analytic memos without over-fragmenting the data. A practical tip is to preserve narrative coherence when coding qualitative data by using broad codes or document-level annotations rather than line-by-line coding.

Discourse analysis

Discourse analysis is used when the aim is to examine how language operates within social, institutional, or interactional contexts. The focus is on how meaning is produced through wording, structure, and interaction, rather than on topic frequency or participant attitudes.

Discourse analysis is common in studies of:

  • Classroom talk
  • Policy documents
  • Media texts
  • Recorded conversations

Analysis typically involves detailed transcription or close reading, followed by examination of features such as framing, positioning, turn-taking, modality, or rhetorical patterns. Researchers often work with extended sequences of text to understand how meanings develop across interaction. Coding is usually selective and analytic, with attention to how excerpts relate to broader discursive patterns rather than fitting into fixed categories.

Practical examples of discourse analysis include analyzing classroom transcripts to examine how teachers and students negotiate authority, or reviewing policy documents to assess how responsibility or accountability is constructed through language.

In NVivo, researchers can code overlapping segments, link transcript text directly to audio or video, and write analytic memos attached to specific turns or sequences. ATLAS.ti supports discourse analysis through quotation-based coding, network views for mapping discursive relationships, and document-level comments that preserve interactional context. A useful tip is to avoid over-coding and instead focus on well-justified excerpts that illustrate recurring discursive practices.

Grounded theory

Grounded theory is used when the goal is to develop theory directly from qualitative data rather than apply an existing conceptual framework. It is commonly used in studies that seek to explain processes, actions, or interactions, especially in areas where prior theory is limited or fragmented.

Grounded theory is often applied in:

  • Sociology
  • Education
  • Health research
  • Organizational studies

The analysis follows an iterative process that includes open coding, constant comparison, and category development. Researchers code data line by line or incident by incident, compare cases continuously, and refine categories as new data are collected or revisited. As analysis progresses, categories are linked to form a theoretical explanation that accounts for variation in the data. Data collection and analysis often occur in parallel.

A practical example of grounded theory is studying how patients manage chronic illness over time or how early-career professionals adapt to workplace expectations. In both cases, the analysis aims to explain patterns of action rather than describe themes alone.

Lumivero's QDA software supports grounded theory by allowing flexible code creation, easy revision of code structures, and memo writing that documents analytic decisions. Plus, coding comparison tools and case classifications can help track category development across participants. A practical tip is to write analytic memos consistently from the early stages, as these memos form the foundation of the emerging theory.

Learn how to code your data, establish rigor, and report results that reflect the insights and theory you developed through your research in “The Essential Guide to Grounded Theory.”

Download now

Phenomenological analysis

Phenomenological analysis is used when the purpose of a study is to examine how individuals experience a specific phenomenon and how they describe that experience in their own terms. It is commonly applied in health, education, and psychology research, particularly when the focus is on perceptions, emotions, or sense-making rather than behavior or outcomes.

Analysis typically begins with careful reading of interview transcripts or written accounts, followed by identification of significant statements related to the phenomenon. These statements are grouped into meaning units, which are then synthesized into descriptions that reflect shared aspects of experience across participants. Researchers often engage in reflexive practices to acknowledge and bracket prior assumptions during analysis.

Practical examples of phenomenological analysis include analyzing interviews with first-generation university students to understand experiences of academic transition, or examining how patients describe living with a long-term condition.

Lumivero's QDA software, NVivo and ATLAS.ti, can support phenomenological analysis by allowing researchers to code substantial excerpts, compare meaning units across cases, and store reflexive notes linked directly to participant accounts. They also support this work through quotation-based coding, memo systems for documenting interpretive decisions, and tools that allow researchers to review entire documents alongside analytic notes. A practical tip is to avoid excessive sub-coding and instead focus on clearly defined meaning units that reflect participants’ descriptions.

Case study analysis

Case study analysis is used when the focus is on developing an in-depth understanding of a bounded case, such as an individual, group, organization, program, or event. This method is appropriate when researchers want to examine complex phenomena within real-world contexts, especially where boundaries between the case and its setting are not clearly defined.

Analysis involves organizing and integrating multiple data sources, which may include interviews, observations, documents, and artifacts. Researchers often analyze data within each case first, followed by cross-case comparison when multiple cases are included. Coding may focus on themes, processes, or explanatory factors that help account for similarities and differences across cases.

Practical examples of case study analysis include examining the implementation of a new curriculum within a single school, or comparing how different clinics adopt the same intervention. Case study analysis is also common in evaluation and applied research.

ATLAS.ti and NVivo can support case study analysis by allowing researchers to group data by case, link evidence from different sources, and compare patterns across cases. Case-level attributes, document grouping, and linked memos help maintain a clear structure while preserving contextual detail. A useful tip is to define case boundaries early and reflect those boundaries consistently in how data are organized and coded.

Framework analysis

Framework analysis is used when research is guided by predefined questions, objectives, or policy concerns. It is commonly applied in applied research, program evaluation, and health or policy studies where transparency and comparability across cases are required. This approach works well when researchers need to balance inductive insights with a structured analytic process.

Framework analysis follows a series of steps that include:

  • Familiarization
  • Identifying an analytic framework
  • Indexing data according to that framework
  • Charting data into a matrix
  • Interpreting patterns across cases and categories

Data are summarized rather than fully coded line by line, which allows researchers to compare cases systematically while retaining links to the original data.

Practical examples of framework analysis include analyzing stakeholder interviews to assess program implementation across multiple sites, or reviewing service user feedback against predefined evaluation criteria.
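The indexing-and-charting steps amount to building a case-by-category matrix of short summaries, each keeping a pointer back to its source. A minimal sketch, with invented sites, categories, and source labels:

```python
# Hypothetical framework: analytic categories drawn from the study objectives.
categories = ["barriers", "facilitators", "suggestions"]

# The chart: (case, category) -> short summary plus a pointer to the source,
# so every cell stays linked to the original data.
chart: dict[tuple[str, str], str] = {}

def chart_entry(case: str, category: str, summary: str, source: str) -> None:
    """Record a summarized finding for one case under one category."""
    chart[(case, category)] = f"{summary} [{source}]"

chart_entry("Site A", "barriers", "Staff turnover disrupted delivery", "A-int2")
chart_entry("Site A", "facilitators", "Strong local leadership", "A-int1")
chart_entry("Site B", "barriers", "Limited referral pathways", "B-int1")

# Render the matrix: one row per case, one column per category.
for case in sorted({c for c, _ in chart}):
    row = [chart.get((case, cat), "-") for cat in categories]
    print(case, "|", " | ".join(row))
```

An empty cell is itself informative in framework analysis: it shows at a glance which cases have nothing recorded under a given category.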

NVivo supports framework analysis through its framework matrices, which allow coded data to be summarized and compared across cases and categories in a structured table. ATLAS.ti can support similar work by combining codes with case classifications and using tables or network views to compare patterns across groups. A useful tip is to revisit the framework during analysis and adjust categories when the data consistently challenge initial assumptions.

Sentiment analysis

Sentiment analysis is used when the objective is to assess attitudes, opinions, or emotional tone within qualitative text data. It is most often applied to large datasets where manually interpreting each response would be impractical, such as open-ended survey responses, customer or patient feedback, and social media comments. While commonly associated with automated techniques, sentiment analysis can also be conducted qualitatively.

In qualitative applications, researchers define sentiment categories such as positive, negative, or neutral, and may add more specific labels related to affect or stance. Text segments are reviewed and assigned to these categories, with attention to context, sarcasm, and ambiguity that automated approaches often miss. The analysis may be combined with thematic coding to examine what topics are associated with particular sentiments.
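A reviewer-in-the-loop version of this can be sketched as a simple cue-matching pass whose suggestions a human then confirms. The cue words below are invented for the example; real sentiment cues are far richer, and context, sarcasm, and ambiguity defeat keyword matching, which is exactly why the review step matters.

```python
import re

# Hypothetical sentiment cues; every suggested label should be
# confirmed by a human reviewer against the original text.
POSITIVE = {"helpful", "excellent", "quick"}
NEGATIVE = {"delay", "rude", "confusing"}

def suggest_sentiment(segment: str) -> str:
    """Suggest a provisional sentiment label from simple word cues."""
    words = set(re.findall(r"[a-z']+", segment.lower()))
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"  # ties and no-cue segments go to close reading

feedback = [
    "The nurse was helpful and the visit was quick.",
    "There was a long delay and the forms were confusing.",
]
for seg in feedback:
    print(suggest_sentiment(seg), "-", seg)
```

Validating automated results against a close reading of a subset, as suggested below, is the manual analogue of auditing this function's output segment by segment.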

Practical examples of sentiment analysis include analyzing patient feedback to identify sources of dissatisfaction, or examining public responses to policy announcements to understand prevailing attitudes.

NVivo includes automated sentiment coding that quickly sorts text into predefined emotional categories that can be reviewed and refined manually, allowing researchers to check classifications against the original text. ATLAS.ti supports sentiment analysis through manual coding and code co-occurrence tools that help examine how sentiment relates to specific themes. A practical tip is to treat automated sentiment results as a starting point and validate them through close reading of a subset of data.

Rhizome analysis

Rhizome analysis is used when researchers want to examine data as non-linear, interconnected, and evolving rather than structured around fixed categories or hierarchies. Drawing on Deleuzian theory, this approach is appropriate for studies that focus on complexity, multiplicity, and shifting relationships, such as research on identity, learning, networks, or digital interaction.

Rather than progressing through predefined steps, rhizome analysis involves tracing connections across data points, concepts, and contexts. Researchers move back and forth between data segments, noting how ideas link, diverge, or reappear in different forms. Coding is often provisional and fluid, with emphasis placed on mapping relationships rather than stabilizing themes. The analysis remains open to reconfiguration as new connections emerge.

Practical examples of rhizome analysis include analyzing online discussion forums to examine how ideas circulate and mutate across participants, or studying professional learning communities where practices develop through informal interactions rather than formal structures.

ATLAS.ti is well suited to rhizome analysis through its network view, which allows researchers to visually map relationships among codes, quotations, and memos. NVivo can support this work by enabling flexible coding, extensive memo linking, and exploratory queries across overlapping codes. A useful tip is to resist closing the analysis too early and instead document emerging connections as analytic objects in their own right.

Streamlining analysis with QDA software

QDA software is used to manage complexity, maintain transparency, and support systematic analysis across a wide range of qualitative methods. While software does not take the place of a researcher, it provides structures that make analytic work more consistent, traceable, and collaborative. Tools such as NVivo and ATLAS.ti are commonly used across disciplines and can be adapted to different methodological approaches.

Organization and coding

QDA software helps researchers store, organize, and retrieve large volumes of qualitative data in a single workspace. Interview transcripts, documents, PDFs, audio, video, and images can be imported and grouped logically. Coding tools allow researchers to apply labels to relevant segments, revise code systems over time, and keep track of where codes are used across the dataset.

Lumivero's research software supports overlapping codes, hierarchical code structures, and memo writing linked directly to data, which is useful for methods that require iterative refinement.

Analysis and visualization

Analytic tools support pattern identification and comparison across cases, codes, or data sources. Code co-occurrence tables, queries, and visual maps help researchers examine relationships that may not be apparent through reading alone. Visualizations such as code networks, matrices, or charts can support analytic reasoning and reporting, particularly in larger or team-based projects.
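A code co-occurrence table is conceptually simple: for each coded segment, count every pair of codes applied to it. A minimal sketch with invented codes, not the implementation used by any particular QDA tool:

```python
from itertools import combinations
from collections import Counter

# Hypothetical coding: each segment carries the set of codes applied to it.
segment_codes = [
    {"cost", "access"},
    {"access", "trust"},
    {"cost", "access", "trust"},
]

# Co-occurrence: count each pair of codes appearing on the same segment.
cooc = Counter()
for codes in segment_codes:
    for pair in combinations(sorted(codes), 2):
        cooc[pair] += 1

print(cooc.most_common())
```

High counts flag relationships worth examining in context, such as retrieving the segments where "cost" and "access" are coded together and reading them side by side.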

Collaboration and transparency

QDA software, like NVivo and ATLAS.ti, supports collaborative work by enabling shared codebooks, version control, and documentation of analytic decisions. Features such as audit trails, memos, and linked annotations help make the analytic process explicit, which is important for methodological transparency and review.

Looking for a more structured way to manage qualitative data?

Lumivero's research software, NVivo and ATLAS.ti, gives you a structured way to code, compare, and document qualitative data across interviews, field notes, surveys, and more. Spend less time managing files and more time making sense of your data, with tools built for transparent, defensible research.

Buy now