Qualitative Data Analysis: What is it, Methods + Examples

Explore qualitative data analysis with diverse methods and real-world examples. Uncover the nuances of human experiences with this guide.

In a world rich with information and narrative, understanding the deeper layers of human experiences requires a unique vision that goes beyond numbers and figures. This is where the power of qualitative data analysis comes to light.

In this blog, we’ll learn about qualitative data analysis, explore its methods, and provide real-life examples showcasing its power in uncovering insights.

What is Qualitative Data Analysis?

Qualitative data analysis is a systematic process of examining non-numerical data to extract meaning, patterns, and insights.

In contrast to quantitative analysis, which focuses on numbers and statistical metrics, qualitative analysis works with non-numerical material such as text, images, audio, and video. It seeks to understand human experiences, perceptions, and behaviors by examining the richness of the data.

Companies frequently conduct this analysis on customer feedback. You can collect qualitative data from reviews, complaints, chat messages, interactions with support centers, customer interviews, case notes, or even social media comments. This kind of data holds the key to understanding customer sentiments and preferences in a way that goes beyond mere numbers.

Importance of Qualitative Data Analysis

Qualitative data analysis plays a crucial role in your research and decision-making process across various disciplines. Let’s explore some key reasons that underline the significance of this analysis:

In-Depth Understanding

It enables you to explore complex and nuanced aspects of a phenomenon, delving into the ‘how’ and ‘why’ questions. This method provides you with a deeper understanding of human behavior, experiences, and contexts that quantitative approaches might not capture fully.

Contextual Insight

You can use this analysis to give context to numerical data. It will help you understand the circumstances and conditions that influence participants’ thoughts, feelings, and actions. This contextual insight becomes essential for generating comprehensive explanations.

Theory Development

You can generate or refine hypotheses via qualitative data analysis. As you analyze the data attentively, you can form hypotheses, concepts, and frameworks that will drive your future research and contribute to theoretical advances.

Participant Perspectives

When performing qualitative research, you can highlight participant voices and opinions. This approach is especially useful for understanding marginalized or underrepresented people, as it allows them to communicate their experiences and points of view.

Exploratory Research

The analysis is frequently used at the exploratory stage of your project. It assists you in identifying important variables, developing research questions, and designing quantitative studies that will follow.

Types of Qualitative Data

When conducting qualitative research, you can use several qualitative data collection methods, and you will come across many types of qualitative data that can provide unique insights into your study topic. Each data type adds new perspectives and angles to your understanding and analysis.

Interviews and Focus Groups

Interviews and focus groups will be among your key methods for gathering qualitative data. Interviews are one-on-one talks in which participants can freely share their thoughts, experiences, and opinions.

Focus groups, on the other hand, are discussions in which members interact with one another, resulting in dynamic exchanges of ideas. Both methods provide rich qualitative data and direct access to participant perspectives.

Observations and Field Notes

Observations and field notes are another useful sort of qualitative data. You can immerse yourself in the research environment through direct observation, carefully documenting behaviors, interactions, and contextual factors.

These observations are recorded in your field notes, providing a complete picture of the environment and the behaviors you’re researching. This data type is especially important for understanding behavior in its natural setting.

Textual and Visual Data

Textual and visual data include a wide range of resources that can be qualitatively analyzed. Documents, written narratives, and transcripts from various sources, such as interviews or speeches, are examples of textual data.

Photographs, films, and even artwork provide a visual layer to your research. These forms of data allow you to investigate not only what is said but also the underlying emotions, details, and symbols expressed through language or imagery.

When to Choose Qualitative Data Analysis over Quantitative Data Analysis

As you begin your research journey, understanding when qualitative data analysis is the right choice will guide your approach to complex phenomena. Analyzing qualitative data yields insights that complement quantitative methodologies, giving you a broader understanding of your study topic.

It is critical to know when to use qualitative analysis instead of quantitative procedures. You may prefer qualitative data analysis when:

  • Complexity Reigns: When your research questions involve deep human experiences, motivations, or emotions, qualitative research excels at revealing these complexities.
  • Exploration is Key: Qualitative analysis is ideal for exploratory research. It will assist you in understanding a new or poorly understood topic before formulating quantitative hypotheses.
  • Context Matters: If you want to understand how context affects behaviors or results, qualitative data analysis provides the depth needed to grasp these relationships.
  • Unanticipated Findings: When your study provides surprising new viewpoints or ideas, qualitative analysis helps you to delve deeply into these emerging themes.
  • Subjective Interpretation is Vital: When it comes to understanding people’s subjective experiences and interpretations, qualitative data analysis is the way to go.

You can make informed decisions regarding the right approach for your research objectives if you understand the importance of qualitative analysis and recognize the situations where it shines.

Qualitative Data Analysis Methods and Examples

Exploring various qualitative data analysis methods will provide you with a wide collection for making sense of your research findings. Once the data has been collected, you can choose from several analysis methods based on your research objectives and the data type you’ve collected.

There are five main methods for analyzing qualitative data. Each method takes a distinct approach to identifying patterns, themes, and insights within your qualitative data. They are:

Method 1: Content Analysis

Content analysis is a methodical technique for analyzing textual or visual data in a structured manner. In this method, you categorize qualitative data by splitting it into manageable units and assigning codes to those units through a manual coding process.

As you go, you’ll notice recurring codes and patterns that allow you to draw conclusions about the content. This method is very useful for detecting common ideas, concepts, or themes in your data without losing the context.

Steps to Do Content Analysis

Follow these steps when conducting content analysis:

  • Collect and Immerse: Begin by collecting the necessary textual or visual data. Immerse yourself in this data to fully understand its content, context, and complexities.
  • Assign Codes and Categories: Assign codes to relevant data sections that systematically represent major ideas or themes. Arrange comparable codes into groups that cover the major themes.
  • Analyze and Interpret: Develop a structured framework from the categories and codes. Then, evaluate the data in the context of your research question, investigate relationships between categories, discover patterns, and draw meaning from these connections.

Benefits & Challenges

There are various advantages to using content analysis:

  • Structured Approach: It offers a systematic approach to dealing with large data sets and ensures consistency throughout the research.
  • Objective Insights: This method promotes objectivity, which helps to reduce potential biases in your study.
  • Pattern Discovery: Content analysis can help uncover hidden trends, themes, and patterns that are not always obvious.
  • Versatility: You can apply content analysis to various data formats, including text, internet content, images, etc.

However, keep in mind the challenges that arise:

  • Subjectivity: Even with the best attempts, a certain bias may remain in coding and interpretation.
  • Complexity: Analyzing huge data sets requires time and great attention to detail.
  • Contextual Nuances: Content analysis may not capture all of the contextual richness that qualitative data analysis highlights.

Example of Content Analysis

Suppose you’re conducting market research and looking at customer feedback on a product. As you collect relevant data and analyze feedback, you’ll see repeating codes like “price,” “quality,” “customer service,” and “features.” These codes are organized into categories such as “positive reviews,” “negative reviews,” and “suggestions for improvement.”

According to your findings, themes such as “price” and “customer service” stand out and show that pricing and customer service greatly impact customer satisfaction. This example highlights the power of content analysis for obtaining significant insights from large textual data collections.
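To make the coding and tallying steps concrete, here is a minimal Python sketch of how you might count manually assigned codes across feedback items and within categories. The feedback items, codes, and categories are invented for illustration, and the coding itself remains a manual, judgment-based step; the script only counts what an analyst has already assigned.

```python
from collections import Counter

# Invented example: each piece of customer feedback has already been read
# and manually assigned one or more codes plus a category during coding.
coded_feedback = [
    {"id": 1, "codes": ["price", "quality"], "category": "positive reviews"},
    {"id": 2, "codes": ["customer service"], "category": "negative reviews"},
    {"id": 3, "codes": ["price", "features"], "category": "suggestions for improvement"},
    {"id": 4, "codes": ["customer service", "price"], "category": "negative reviews"},
]

# Tally how often each code appears across all feedback items.
code_counts = Counter(code for item in coded_feedback for code in item["codes"])
print(code_counts.most_common())

# Tally codes within each category to see which issues drive each group.
by_category = {}
for item in coded_feedback:
    by_category.setdefault(item["category"], Counter()).update(item["codes"])

for category, counts in by_category.items():
    print(category, counts.most_common())
```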

Method 2: Thematic Analysis

Thematic analysis is a well-structured procedure for identifying and analyzing recurring themes in your data. As you become more engaged in the data, you’ll generate codes or short labels representing key concepts. These codes are then organized into themes, providing a consistent framework for organizing and comprehending the substance of the data.

The analysis allows you to organize complex narratives and perspectives into meaningful categories, which will allow you to identify connections and patterns that may not be visible at first.

Steps to Do Thematic Analysis

Follow these steps when conducting a thematic analysis:

  • Code and Group: Start by immersing yourself in the data and thoroughly examining it, assigning initial codes to notable segments. Group related codes together to create initial themes.
  • Analyze and Report: Analyze the data within each theme to derive relevant insights. Organize the topics into a consistent structure and explain your findings, along with data extracts that represent each theme.

Thematic analysis has various benefits:

  • Structured Exploration: It is a method for identifying patterns and themes in complex qualitative data.
  • Comprehensive Understanding: Thematic analysis promotes an in-depth understanding of the complexities and meanings of the data.
  • Application Flexibility: This method can be adapted to various research situations and data types.

However, challenges may arise, such as:

  • Interpretive Nature: Thematic analysis relies heavily on interpretation, so researcher bias must be carefully managed.
  • Time-consuming: The analysis can be time-consuming, especially with large data sets.
  • Subjectivity: The selection of codes and themes can be subjective.

Example of Thematic Analysis

Assume you’re conducting a thematic analysis on job satisfaction interviews. Following your immersion in the data, you assign initial codes such as “work-life balance,” “career growth,” and “colleague relationships.” As you organize these codes, you’ll notice themes develop, such as “Factors Influencing Job Satisfaction” and “Impact on Work Engagement.”

Further investigation reveals the stories and experiences contained within these themes and provides insight into how various factors influence job satisfaction. This example demonstrates how thematic analysis can reveal meaningful patterns and insights in qualitative data.
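As a rough illustration of the bookkeeping involved, here is a minimal Python sketch that maps the initial codes above to broader themes and gathers the interview excerpts that support each theme. The participants, quotes, and code-to-theme mapping are invented; deciding which codes belong to which theme is an interpretive judgment that the code cannot make for you.

```python
# Invented example: initial codes from job satisfaction interviews are
# grouped under broader themes, and supporting excerpts are collected.
code_to_theme = {
    "work-life balance": "Factors Influencing Job Satisfaction",
    "career growth": "Factors Influencing Job Satisfaction",
    "colleague relationships": "Impact on Work Engagement",
}

coded_excerpts = [
    ("P01", "I can switch off at five and be with my family.", "work-life balance"),
    ("P02", "There is a clear path to promotion here.", "career growth"),
    ("P03", "My teammates keep me motivated on hard days.", "colleague relationships"),
]

# Group excerpts by theme so each theme can be reported with its evidence.
themes = {}
for participant, quote, code in coded_excerpts:
    theme = code_to_theme[code]
    themes.setdefault(theme, []).append((participant, quote))

for theme, quotes in themes.items():
    print(f"{theme} ({len(quotes)} excerpt(s))")
    for participant, quote in quotes:
        print(f"  {participant}: {quote}")
```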

Method 3: Narrative Analysis

Narrative analysis focuses on the stories that people share. You’ll investigate the stories in your data, looking at how they are constructed and the meanings they convey. This method is excellent for learning how people make sense of their experiences through narrative.

Steps to Do Narrative Analysis

The following steps are involved in narrative analysis:

  • Gather and Analyze: Start by collecting narratives, such as first-person tales, interviews, or written accounts. Analyze the stories, focusing on the plot, feelings, and characters.
  • Find Themes: Look for recurring themes or patterns in various narratives. Think about the similarities and differences between these topics and personal experiences.
  • Interpret and Extract Insights: Contextualize the narratives within their larger setting. Acknowledge the subjective nature of each narrative and analyze the narrator’s voice and style. Extract insights by examining the emotions, motivations, and implications communicated by the stories.

There are various advantages to narrative analysis:

  • Deep Exploration: It lets you look deeply into people’s personal experiences and perspectives.
  • Human-Centered: This method prioritizes the human perspective, allowing individuals to express themselves.

However, difficulties may arise, such as:

  • Interpretive Complexity: Analyzing narratives requires dealing with the complexities of meaning and interpretation.
  • Time-consuming: Because of the richness and complexity of narratives, working with them can be time-consuming.

Example of Narrative Analysis

Assume you’re conducting narrative analysis on refugee interviews. As you read the stories, you’ll notice common themes of resilience, loss, and hope. The narratives provide insight into the obstacles that refugees face, the strengths they draw on, and the dreams that guide them.

The analysis can provide a deeper insight into the refugees’ experiences and the broader social context they navigate by examining the narratives’ emotional subtleties and underlying meanings. This example highlights how narrative analysis can reveal important insights into human stories.

Method 4: Grounded Theory Analysis

Grounded theory analysis is an iterative and systematic approach that allows you to create theories directly from data without being limited by pre-existing hypotheses. With an open mind, you collect data and generate early codes and labels that capture essential ideas or concepts within the data.

As you progress, you refine these codes and increasingly connect them, eventually developing a theory based on the data. Grounded theory analysis is a dynamic process for developing new insights and hypotheses based on details in your data.

Steps to Do Grounded Theory Analysis

Grounded theory analysis requires the following steps:

  • Initial Coding: First, immerse yourself in the data, producing initial codes that represent major concepts or patterns.
  • Categorize and Connect: Use axial coding to organize the initial codes into categories, establishing relationships and connections between them.
  • Build the Theory: Focus on creating a core category that connects the codes and themes. Regularly refine the theory by comparing and integrating new data, ensuring that it evolves organically from the data.

Grounded theory analysis has various benefits:

  • Theory Generation: It provides a one-of-a-kind opportunity to generate hypotheses straight from data and promotes new insights.
  • In-depth Understanding: The analysis allows you to deeply analyze the data and reveal complex relationships and patterns.
  • Flexible Process: This method is customizable and ongoing, which allows you to enhance your research as you collect additional data.

However, challenges might arise with:

  • Time and Resources: Because grounded theory analysis is a continuous process, it requires a large commitment of time and resources.
  • Theoretical Development: Building a grounded theory requires a thorough understanding of qualitative data analysis and of the relevant theoretical concepts.
  • Interpretive Complexity: Interpreting a newly developed theory and situating it within the existing literature can be intellectually demanding.

Example of Grounded Theory Analysis

Assume you’re performing a grounded theory analysis on workplace collaboration interviews. As you open-code the data, you will discover concepts such as “communication barriers,” “team dynamics,” and “leadership roles.” Axial coding reveals links between these concepts, emphasizing the significance of efficient communication in developing collaboration.

Through selective coding, you create the core category “Integrated Communication Strategies,” which unifies the emerging themes.

This theory-driven category serves as the framework for understanding how numerous aspects contribute to effective team collaboration. This example shows how grounded theory analysis allows you to generate a theory directly from the inherent nature of the data.

Method 5: Discourse Analysis

Discourse analysis focuses on language and communication. You’ll look at how language produces meaning and how it reflects power relations, identities, and cultural influences. This method examines not only what is said but also how it is said: the words, phrasing, and larger context of communication.

The analysis is particularly valuable when investigating power dynamics, identities, and cultural influences encoded in language. By evaluating the language used in your data, you can identify underlying assumptions, cultural norms, and the ways individuals negotiate meaning through communication.

Steps to Do Discourse Analysis

Conducting discourse analysis entails the following steps:

  • Select Discourse: For analysis, choose language-based data such as texts, speeches, or media content.
  • Analyze Language: Immerse yourself in the conversation, examining language choices, metaphors, and underlying assumptions.
  • Discover Patterns: Identify the discourse’s recurring themes, ideologies, and power dynamics. To fully understand the effects of these patterns, place them in their larger context.

There are various advantages of using discourse analysis:

  • Understanding Language: It provides an extensive understanding of how language builds meaning and influences perceptions.
  • Uncovering Power Dynamics: The analysis reveals how power dynamics appear via language.
  • Cultural Insights: This method identifies cultural norms, beliefs, and ideologies stored in communication.

However, the following challenges may arise:

  • Complexity of Interpretation: Language analysis involves navigating multiple levels of nuance and interpretation.
  • Subjectivity: Interpretation can be subjective, so controlling researcher bias is important.
  • Time-Intensive: Discourse analysis can take a lot of time because careful linguistic study is required in this analysis.

Example of Discourse Analysis

Consider doing discourse analysis on media coverage of a political event. You notice repeating linguistic patterns in news articles that depict the event as a conflict between opposing parties. Through deconstruction, you can expose how this framing supports particular ideologies and power relations.

You can illustrate how language choices influence public perceptions and contribute to building the narrative around the event by analyzing the speech within the broader political and social context. This example shows how discourse analysis can reveal hidden power dynamics and cultural influences on communication.

How to do Qualitative Data Analysis with the QuestionPro Research Suite?

QuestionPro is a popular survey and research platform that offers tools for collecting and analyzing qualitative and quantitative data. Follow these general steps for conducting qualitative data analysis using the QuestionPro Research Suite:

  • Collect Qualitative Data: Set up your survey to capture qualitative responses. It might involve open-ended questions, text boxes, or comment sections where participants can provide detailed responses.
  • Export Qualitative Responses: Export the responses once you’ve collected qualitative data through your survey. QuestionPro typically allows you to export survey data in various formats, such as Excel or CSV.
  • Prepare Data for Analysis: Review the exported data and clean it if necessary. Remove irrelevant or duplicate entries to ensure your data is ready for analysis.
  • Code and Categorize Responses: Segment and label the data, letting new patterns emerge naturally, then develop categories through axial coding to structure the analysis (a minimal coding-and-tallying sketch in Python follows this list).
  • Identify Themes: Analyze the coded responses to identify recurring themes, patterns, and insights. Look for similarities and differences in participants’ responses.
  • Generate Reports and Visualizations: Utilize the reporting features of QuestionPro to create visualizations, charts, and graphs that help communicate the themes and findings from your qualitative research.
  • Interpret and Draw Conclusions: Interpret the themes and patterns you’ve identified in the qualitative data. Consider how these findings answer your research questions or provide insights into your study topic.
  • Integrate with Quantitative Data (if applicable): If you’re also conducting quantitative research using QuestionPro, consider integrating your qualitative findings with quantitative results to provide a more comprehensive understanding.
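As a rough illustration of the coding and theme-counting steps above, here is a minimal Python sketch that tallies codes attached to exported responses. The file name and column layout are assumptions about your own export rather than a fixed QuestionPro format, and the codes are assumed to have been assigned by a human reader beforehand.

```python
import pandas as pd

# Assumptions for illustration: "survey_export.csv" is your exported file and
# it contains a "codes" column you filled in while reading each response,
# with codes separated by semicolons (e.g. "pricing; support").
df = pd.read_csv("survey_export.csv")

# Split the semicolon-separated codes into lists, one list per response.
df["code_list"] = df["codes"].fillna("").str.split(";")

# Explode to one row per (response, code) pair, tidy whitespace, and count.
code_counts = (
    df.explode("code_list")
      .assign(code_list=lambda d: d["code_list"].str.strip())
      .query("code_list != ''")["code_list"]
      .value_counts()
)
print(code_counts)
```

The resulting counts can then feed into the reports and visualizations mentioned above, while the interpretation of the themes stays with you.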

Qualitative data analysis is vital in uncovering various human experiences, views, and stories. If you’re ready to transform your research journey and apply the power of qualitative analysis, now is the moment to do it. Book a demo with QuestionPro today and begin your journey of exploration.


Qualitative Data Analysis Methods 101:

The “big 6” methods + examples.

By: Kerryn Warren (PhD) | Reviewed By: Eunice Rautenbach (D.Tech) | May 2020 (Updated April 2023)

Qualitative data analysis methods. Wow, that’s a mouthful. 

If you’re new to the world of research, qualitative data analysis can look rather intimidating. So much bulky terminology and so many abstract, fluffy concepts. It certainly can be a minefield!

Don’t worry – in this post, we’ll unpack the most popular analysis methods, one at a time, so that you can approach your analysis with confidence and competence – whether that’s for a dissertation, thesis or really any kind of research project.


What (exactly) is qualitative data analysis?

To understand qualitative data analysis, we need to first understand qualitative data – so let’s step back and ask the question, “what exactly is qualitative data?”.

Qualitative data refers to pretty much any data that’s “not numbers”. In other words, it’s not the stuff you measure using a fixed scale or complex equipment, nor do you analyse it using complex statistics or mathematics.

So, if it’s not numbers, what is it?

Words, you guessed? Well… sometimes, yes. Qualitative data can, and often does, take the form of interview transcripts, documents and open-ended survey responses – but it can also involve the interpretation of images and videos. In other words, qualitative isn’t just limited to text-based data.

So, how’s that different from quantitative data, you ask?

Simply put, qualitative research focuses on words, descriptions, concepts or ideas – while quantitative research focuses on numbers and statistics. Qualitative research investigates the “softer side” of things to explore and describe, while quantitative research focuses on the “hard numbers”, to measure differences between variables and the relationships between them. If you’re keen to learn more about the differences between qual and quant, we’ve got a detailed post over here.


So, qualitative analysis is easier than quantitative, right?

Not quite. In many ways, qualitative data can be challenging and time-consuming to analyse and interpret. At the end of your data collection phase (which itself takes a lot of time), you’ll likely have many pages of text-based data or hours upon hours of audio to work through. You might also have subtle nuances of interactions or discussions that have danced around in your mind, or that you scribbled down in messy field notes. All of this needs to work its way into your analysis.

Making sense of all of this is no small task and you shouldn’t underestimate it. Long story short – qualitative analysis can be a lot of work! Of course, quantitative analysis is no piece of cake either, but it’s important to recognise that qualitative analysis still requires a significant investment in terms of time and effort.


In this post, we’ll explore qualitative data analysis by looking at some of the most common analysis methods we encounter. We’re not going to cover every possible qualitative method and we’re not going to go into heavy detail – we’re just going to give you the big picture. That said, we will of course include links to loads of extra resources so that you can learn more about whichever analysis method interests you.

Without further delay, let’s get into it.

The “Big 6” Qualitative Analysis Methods 

There are many different types of qualitative data analysis, all of which serve different purposes and have unique strengths and weaknesses. We’ll start by outlining the analysis methods and then we’ll dive into the details for each.

The 6 most popular methods (or at least the ones we see at Grad Coach) are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Thematic analysis
  • Grounded theory (GT)
  • Interpretive phenomenological analysis (IPA)

Let’s take a look at each of them…

QDA Method #1: Qualitative Content Analysis

Content analysis is possibly the most common and straightforward QDA method. At the simplest level, content analysis is used to evaluate patterns within a piece of content (for example, words, phrases or images) or across multiple pieces of content or sources of communication. For example, a collection of newspaper articles or political speeches.

With content analysis, you could, for instance, identify the frequency with which an idea is shared or spoken about – like the number of times a Kardashian is mentioned on Twitter. Or you could identify patterns of deeper underlying interpretations – for instance, by identifying phrases or words in tourist pamphlets that highlight India as an ancient country.

Because content analysis can be used in such a wide variety of ways, it’s important to go into your analysis with a very specific question and goal, or you’ll get lost in the fog. With content analysis, you’ll group large amounts of text into codes, summarise these into categories, and possibly even tabulate the data to calculate the frequency of certain concepts or variables. Because of this, content analysis provides a small splash of quantitative thinking within a qualitative method.
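To show that small splash of quantitative thinking in practice, here is a minimal Python sketch that counts how often a single concept appears across a handful of invented text snippets. The documents and the concept are hypothetical, and real content analysis would still involve careful reading, coding, and attention to context that counting alone cannot provide.

```python
# Invented stand-ins for pieces of content (tweets, articles, pamphlet text).
documents = [
    "The pamphlet describes India as an ancient country with ancient temples.",
    "Visitors praise the ancient forts and the modern cities alike.",
    "The tour focuses on food, festivals, and coastal towns.",
]

# Count how often one concept is mentioned in each piece of content.
concept = "ancient"
mentions = [doc.lower().count(concept) for doc in documents]

print("Mentions per document:", mentions)
print("Total mentions:", sum(mentions))
print("Documents mentioning the concept:", sum(1 for n in mentions if n > 0))
```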

Naturally, while content analysis is widely useful, it’s not without its drawbacks. One of the main issues with content analysis is that it can be very time-consuming, as it requires lots of reading and re-reading of the texts. Also, because of its multidimensional focus on both qualitative and quantitative aspects, it is sometimes accused of losing important nuances in communication.

Content analysis also tends to concentrate on a very specific timeline and doesn’t take into account what happened before or after that timeline. This isn’t necessarily a bad thing though – just something to be aware of. So, keep these factors in mind if you’re considering content analysis. Every analysis method has its limitations, so don’t be put off by these – just be aware of them! If you’re interested in learning more about content analysis, the video below provides a good starting point.

QDA Method #2: Narrative Analysis 

As the name suggests, narrative analysis is all about listening to people telling stories and analysing what that means . Since stories serve a functional purpose of helping us make sense of the world, we can gain insights into the ways that people deal with and make sense of reality by analysing their stories and the ways they’re told.

You could, for example, use narrative analysis to explore whether how something is being said is important. For instance, the narrative of a prisoner trying to justify their crime could provide insight into their view of the world and the justice system. Similarly, analysing the ways entrepreneurs talk about the struggles in their careers or cancer patients telling stories of hope could provide powerful insights into their mindsets and perspectives . Simply put, narrative analysis is about paying attention to the stories that people tell – and more importantly, the way they tell them.

Of course, the narrative approach has its weaknesses , too. Sample sizes are generally quite small due to the time-consuming process of capturing narratives. Because of this, along with the multitude of social and lifestyle factors which can influence a subject, narrative analysis can be quite difficult to reproduce in subsequent research. This means that it’s difficult to test the findings of some of this research.

Similarly, researcher bias can have a strong influence on the results here, so you need to be particularly careful about the potential biases you can bring into your analysis when using this method. Nevertheless, narrative analysis is still a very useful qualitative analysis method – just keep these limitations in mind and be careful not to draw broad conclusions . If you’re keen to learn more about narrative analysis, the video below provides a great introduction to this qualitative analysis method.

QDA Method #3: Discourse Analysis 

Discourse is simply a fancy word for written or spoken language or debate . So, discourse analysis is all about analysing language within its social context. In other words, analysing language – such as a conversation, a speech, etc – within the culture and society it takes place. For example, you could analyse how a janitor speaks to a CEO, or how politicians speak about terrorism.

To truly understand these conversations or speeches, the culture and history of those involved in the communication are important factors to consider. For example, a janitor might speak more casually with a CEO in a company that emphasises equality among workers. Similarly, a politician might speak more about terrorism if there was a recent terrorist incident in the country.

So, as you can see, by using discourse analysis, you can identify how culture, history or power dynamics (to name a few) have an effect on the way concepts are spoken about. So, if your research aims and objectives involve understanding culture or power dynamics, discourse analysis can be a powerful method.

Because there are many social influences in terms of how we speak to each other, the potential use of discourse analysis is vast . Of course, this also means it’s important to have a very specific research question (or questions) in mind when analysing your data and looking for patterns and themes, or you might land up going down a winding rabbit hole.

Discourse analysis can also be very time-consuming  as you need to sample the data to the point of saturation – in other words, until no new information and insights emerge. But this is, of course, part of what makes discourse analysis such a powerful technique. So, keep these factors in mind when considering this QDA method. Again, if you’re keen to learn more, the video below presents a good starting point.

QDA Method #4: Thematic Analysis

Thematic analysis looks at patterns of meaning in a data set – for example, a set of interviews or focus group transcripts. But what exactly does that… mean? Well, a thematic analysis takes bodies of data (which are often quite large) and groups them according to similarities – in other words, themes . These themes help us make sense of the content and derive meaning from it.

Let’s take a look at an example.

With thematic analysis, you could analyse 100 online reviews of a popular sushi restaurant to find out what patrons think about the place. By reviewing the data, you would then identify the themes that crop up repeatedly within the data – for example, “fresh ingredients” or “friendly wait staff”.
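As a rough first pass on data like this, here is a minimal Python sketch that screens a few invented reviews for mentions of the two candidate themes above. Keyword matching of this kind is only a crude screening aid; in a real thematic analysis you would read each review and code it by judgment.

```python
# Invented reviews and candidate themes with a few indicative keywords each.
reviews = [
    "The fish was incredibly fresh and the staff were so friendly.",
    "Friendly waiters, but the rice was bland.",
    "Great quality ingredients, though the wait was long.",
]

themes = {
    "fresh ingredients": ["fresh", "quality ingredients"],
    "friendly wait staff": ["friendly", "waiter", "staff"],
}

# Count how many reviews contain at least one keyword for each theme.
theme_hits = {theme: 0 for theme in themes}
for review in reviews:
    text = review.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            theme_hits[theme] += 1

print(theme_hits)
```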

So, as you can see, thematic analysis can be pretty useful for finding out about people’s experiences, views, and opinions. Therefore, if your research aims and objectives involve understanding people’s experience or view of something, thematic analysis can be a great choice.

Since thematic analysis is a bit of an exploratory process, it’s not unusual for your research questions to develop , or even change as you progress through the analysis. While this is somewhat natural in exploratory research, it can also be seen as a disadvantage as it means that data needs to be re-reviewed each time a research question is adjusted. In other words, thematic analysis can be quite time-consuming – but for a good reason. So, keep this in mind if you choose to use thematic analysis for your project and budget extra time for unexpected adjustments.


QDA Method #5: Grounded theory (GT) 

Grounded theory is a powerful qualitative analysis method where the intention is to create a new theory (or theories) using the data at hand, through a series of “tests” and “revisions”. Strictly speaking, GT is more a research design type than an analysis method, but we’ve included it here as it’s often referred to as a method.

What’s most important with grounded theory is that you go into the analysis with an open mind and let the data speak for itself – rather than dragging existing hypotheses or theories into your analysis. In other words, your analysis must develop from the ground up (hence the name). 

Let’s look at an example of GT in action.

Assume you’re interested in developing a theory about what factors influence students to watch a YouTube video about qualitative analysis. Using Grounded theory , you’d start with this general overarching question about the given population (i.e., graduate students). First, you’d approach a small sample – for example, five graduate students in a department at a university. Ideally, this sample would be reasonably representative of the broader population. You’d interview these students to identify what factors lead them to watch the video.

After analysing the interview data, a general pattern could emerge. For example, you might notice that graduate students are more likely to read a post about qualitative methods if they are just starting on their dissertation journey, or if they have an upcoming test about research methods.

From here, you’ll look for another small sample – for example, five more graduate students in a different department – and see whether this pattern holds true for them. If not, you’ll look for commonalities and adapt your theory accordingly. As this process continues, the theory would develop . As we mentioned earlier, what’s important with grounded theory is that the theory develops from the data – not from some preconceived idea.

So, what are the drawbacks of grounded theory? Well, some argue that there’s a tricky circularity to grounded theory. For it to work, in principle, you should know as little as possible regarding the research question and population, so that you reduce the bias in your interpretation. However, in many circumstances, it’s also thought to be unwise to approach a research question without knowledge of the current literature . In other words, it’s a bit of a “chicken or the egg” situation.

Regardless, grounded theory remains a popular (and powerful) option. Naturally, it’s a very useful method when you’re researching a topic that is completely new or has very little existing research about it, as it allows you to start from scratch and work your way from the ground up .


QDA Method #6: Interpretive Phenomenological Analysis (IPA)

Interpretive. Phenomenological. Analysis. IPA. Try saying that three times fast…

Let’s just stick with IPA, okay?

IPA is designed to help you understand the personal experiences of a subject (for example, a person or group of people) concerning a major life event, an experience or a situation . This event or experience is the “phenomenon” that makes up the “P” in IPA. Such phenomena may range from relatively common events – such as motherhood, or being involved in a car accident – to those which are extremely rare – for example, someone’s personal experience in a refugee camp. So, IPA is a great choice if your research involves analysing people’s personal experiences of something that happened to them.

It’s important to remember that IPA is subject-centred. In other words, it’s focused on the experiencer. This means that, while you’ll likely use a coding system to identify commonalities, it’s important not to lose the depth of experience or meaning by trying to reduce everything to codes. Also, keep in mind that since your sample size will generally be very small with IPA, you often won’t be able to draw broad conclusions about the generalisability of your findings. But that’s okay as long as it aligns with your research aims and objectives.

Another thing to be aware of with IPA is personal bias . While researcher bias can creep into all forms of research, self-awareness is critically important with IPA, as it can have a major impact on the results. For example, a researcher who was a victim of a crime himself could insert his own feelings of frustration and anger into the way he interprets the experience of someone who was kidnapped. So, if you’re going to undertake IPA, you need to be very self-aware or you could muddy the analysis.


How to choose the right analysis method

In light of all of the qualitative analysis methods we’ve covered so far, you’re probably asking yourself the question, “How do I choose the right one?”

Much like all the other methodological decisions you’ll need to make, selecting the right qualitative analysis method largely depends on your research aims, objectives and questions . In other words, the best tool for the job depends on what you’re trying to build. For example:

  • Perhaps your research aims to analyse the use of words and what they reveal about the intention of the storyteller and the cultural context of the time.
  • Perhaps your research aims to develop an understanding of the unique personal experiences of people that have experienced a certain event, or
  • Perhaps your research aims to develop insight regarding the influence of a certain culture on its members.

As you can probably see, each of these research aims is distinctly different, and therefore different analysis methods would be suitable for each one. For example, narrative analysis would likely be a good option for the first aim, while grounded theory wouldn’t be as relevant.

It’s also important to remember that each method has its own set of strengths, weaknesses and general limitations. No single analysis method is perfect. So, depending on the nature of your research, it may make sense to adopt more than one method (this is called triangulation). Keep in mind though that this will of course be quite time-consuming.

As we’ve seen, all of the qualitative analysis methods we’ve discussed make use of coding and theme-generating techniques, but the intent and approach of each analysis method differ quite substantially. So, it’s very important to come into your research with a clear intention before you decide which analysis method (or methods) to use.

Start by reviewing your research aims , objectives and research questions to assess what exactly you’re trying to find out – then select a qualitative analysis method that fits. Never pick a method just because you like it or have experience using it – your analysis method (or methods) must align with your broader research aims and objectives.


Let’s recap on QDA methods…

In this post, we looked at six popular qualitative data analysis methods:

  • First, we looked at content analysis, a straightforward method that blends a little bit of quant into a primarily qualitative analysis.
  • Then we looked at narrative analysis, which is about analysing how stories are told.
  • Next up was discourse analysis – which is about analysing conversations and interactions.
  • Then we moved on to thematic analysis – which is about identifying themes and patterns.
  • From there, we went south with grounded theory – which is about starting from scratch with a specific question and using the data alone to build a theory in response to that question.
  • And finally, we looked at IPA – which is about understanding people’s unique experiences of a phenomenon.

Of course, these aren’t the only options when it comes to qualitative data analysis, but they’re a great starting point if you’re dipping your toes into qualitative research for the first time.

If you’re still feeling a bit confused, consider our private coaching service, where we hold your hand through the research process to help you develop your best work.




What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari. Revised on June 22, 2023.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research, which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

Some examples of qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?


Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography, action research, phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Qualitative research approaches:

  • Grounded theory: Researchers collect rich data on a topic of interest and develop theories.
  • Ethnography: Researchers immerse themselves in groups or organizations to understand their cultures.
  • Action research: Researchers and participants collaboratively link theory to practice to drive social change.
  • Phenomenological research: Researchers investigate a phenomenon or event by describing and interpreting participants’ lived experiences.
  • Narrative research: Researchers examine how stories are told to understand how participants perceive and make sense of their experiences.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect, observer bias, recall bias, and social desirability bias. While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.


Each of the research approaches involves using one or more data collection methods. These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys : distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.
For example, in a hypothetical study of a large company’s culture, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps:

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary (a minimal sketch of this tagging step follows this list).
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
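To make these steps concrete, here is a minimal sketch in plain Python (no QDA software assumed) of what steps 3–5 can look like for open-ended survey responses. The responses, keyword lists, and theme groupings are all invented for illustration; real coding is interpretive rather than simple keyword matching.

```python
from collections import Counter

# Hypothetical open-ended survey responses (already transcribed and reviewed: steps 1-2)
responses = [
    "I love the flexible hours, but meetings run far too long.",
    "Management rarely shares updates; communication feels one-way.",
    "Flexible scheduling lets me balance family and work.",
]

# Step 3: a toy code system (keyword-based only for the sake of the example)
code_system = {
    "flexibility": ["flexible", "scheduling", "hours"],
    "communication": ["communication", "updates", "meetings"],
}

# Step 4: assign codes to each response
coded = []
for response in responses:
    codes = [code for code, keywords in code_system.items()
             if any(keyword in response.lower() for keyword in keywords)]
    coded.append({"response": response, "codes": codes})

# Step 5: link codes into overarching themes and count how often each theme occurs
themes = {"Work-life balance": {"flexibility"}, "Organizational transparency": {"communication"}}
theme_counts = Counter(
    theme for item in coded
    for theme, theme_codes in themes.items()
    if theme_codes & set(item["codes"])
)

print(coded)
print(theme_counts)
```

In practice the codes would be developed and refined by reading the data, and a spreadsheet or QDA tool would replace the dictionaries used here.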

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative data analysis

  • Content analysis: used to describe and categorize common words, phrases, and ideas in qualitative data. For example, a market researcher could perform content analysis to find out what kind of language is used in descriptions of therapeutic apps.
  • Thematic analysis: used to identify and interpret patterns and themes in qualitative data. For example, a psychologist could apply thematic analysis to travel blogs to explore how tourism shapes self-identity.
  • Textual analysis: used to examine the content, structure, and design of texts. For example, a media researcher could use textual analysis to understand how news coverage of celebrities has changed in the past decade.
  • Discourse analysis: used to study communication and how language is used to achieve effects in specific contexts. For example, a political scientist could use discourse analysis to study how politicians generate trust in election campaigns.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

Data collection and analysis can be adapted as new ideas or patterns emerge; they are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.


Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated . The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population .

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.


Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis, thematic analysis, and discourse analysis.


The Ultimate Guide to Qualitative Research - Part 2: Handling Qualitative Data




Qualitative data analysis

Analyzing qualitative data is the next step after you have completed the use of qualitative data collection methods . The qualitative analysis process aims to identify themes and patterns that emerge across the data.


In simplified terms, qualitative research methods involve non-numerical data collection followed by an explanation based on the attributes of the data . For example, if you are asked to explain in qualitative terms a thermal image displayed in multiple colors, then you would explain the color differences rather than the heat's numerical value. If you have a large amount of data (e.g., of group discussions or observations of real-life situations), the next step is to transcribe and prepare the raw data for subsequent analysis.

Researchers can conduct studies fully based on qualitative methodology, or researchers can preface a quantitative research study with a qualitative study to identify issues that were not originally envisioned but are important to the study. Quantitative researchers may also collect and analyze qualitative data following their quantitative analyses to better understand the meanings behind their statistical results.

Conducting qualitative research can especially help build an understanding of how and why certain outcomes were achieved (in addition to what was achieved). For example, qualitative data analysis is often used for policy and program evaluation research since it can answer certain important questions more efficiently and effectively than quantitative approaches.


Qualitative data analysis can also answer important questions about the relevance, unintended effects, and impact of programs, such as:

  • Were expectations reasonable?
  • Did processes operate as expected?
  • Were key players able to carry out their duties?
  • Were there any unintended effects of the program?

The importance of qualitative data analysis

Qualitative approaches have the advantage of allowing for more diversity in responses and the capacity to adapt to new developments or issues during the research process itself. While qualitative analysis of data can be demanding and time-consuming to conduct, many fields of research utilize qualitative software tools that have been specifically developed to provide more succinct, cost-efficient, and timely results.


Qualitative data analysis is an important part of research and building greater understanding across fields for a number of reasons. First, cases for qualitative data analysis can be selected purposefully according to whether they typify certain characteristics or contextual locations. In other words, qualitative data permits deep immersion into a topic, phenomenon, or area of interest. Rather than seeking generalizability to the population that the sample of participants represents, qualitative research aims to construct an in-depth and nuanced understanding of the research topic.

Secondly, the role or position of the researcher in qualitative analysis of data is given greater critical attention. This is because, in qualitative data analysis, the possibility of the researcher taking a ‘neutral' or transcendent position is seen as more problematic in practical and/or philosophical terms. Hence, qualitative researchers are often exhorted to reflect on their role in the research process and make this clear in the analysis.


Thirdly, while qualitative data analysis can take a wide variety of forms, it largely differs from quantitative research in the focus on language, signs, experiences, and meaning. In addition, qualitative approaches to analysis are often holistic and contextual rather than analyzing the data in a piecemeal fashion or removing the data from its context. Qualitative approaches thus allow researchers to explore inquiries from directions that could not be accessed with only numerical quantitative data.

Establishing research rigor

Systematic and transparent approaches to the analysis of qualitative data are essential for rigor . For example, many qualitative research methods require researchers to carefully code data and discern and document themes in a consistent and credible way.


Perhaps the most traditional division in the way qualitative and quantitative research have been used in the social sciences is for qualitative methods to be used for exploratory purposes (e.g., to generate new theory or propositions) or to explain puzzling quantitative results, while quantitative methods are used to test hypotheses .


After you’ve collected relevant data , what is the best way to look at your data ? As always, it will depend on your research question . For instance, if you employed an observational research method to learn about a group’s shared practices, an ethnographic approach could be appropriate to explain the various dimensions of culture. If you collected textual data to understand how people talk about something, then a discourse analysis approach might help you generate key insights about language and communication.


The qualitative data coding process involves iterative categorization and recategorization, ensuring the evolution of the analysis to best represent the data. The procedure typically concludes with the interpretation of patterns and trends identified through the coding process.

To start off, let’s look at two broad approaches to data analysis.

Deductive analysis

Deductive analysis is guided by pre-existing theories or ideas. It starts with a theoretical framework , which is then used to code the data. The researcher can thus use this theoretical framework to interpret their data and answer their research question .

The key steps include coding the data based on the predetermined concepts or categories and using the theory to guide the interpretation of patterns among the codings. Deductive analysis is particularly useful when researchers aim to verify or extend an existing theory within a new context.

Inductive analysis

Inductive analysis involves the generation of new theories or ideas based on the data. The process starts without any preconceived theories or codes, and patterns, themes, and categories emerge out of the data.


The researcher codes the data to capture any concepts or patterns that seem interesting or important to the research question . These codes are then compared and linked, leading to the formation of broader categories or themes. The main goal of inductive analysis is to allow the data to 'speak for itself' rather than imposing pre-existing expectations or ideas onto the data.

Deductive and inductive approaches can be seen as sitting on opposite poles, and all research falls somewhere within that spectrum. Most often, qualitative analysis approaches blend both deductive and inductive elements to contribute to the existing conversation around a topic while remaining open to potential unexpected findings. To help you make informed decisions about which qualitative data analysis approach fits with your research objectives, let's look at some of the common approaches for qualitative data analysis.

Content analysis

Content analysis is a research method used to identify patterns and themes within qualitative data. This approach involves systematically coding and categorizing specific aspects of the content in the data to uncover trends and patterns. An often important part of content analysis is quantifying frequencies and patterns of words or characteristics present in the data.

It is a highly flexible technique that can be adapted to various data types , including text, images, and audiovisual content . While content analysis can be exploratory in nature, it is also common to use pre-established theories and follow a more deductive approach to categorizing and quantifying the qualitative data.
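As a rough illustration of the quantifying side of content analysis, the sketch below counts word frequencies and tallies two pre-defined (deductive) categories in a tiny invented corpus. The documents, stopword list, and category keywords are all assumptions made for the example, not a validated coding frame.

```python
import re
from collections import Counter

documents = [
    "The app feels calm and easy to use, a really calming experience.",
    "Too many notifications; the constant reminders are stressful.",
]

stopwords = {"the", "and", "a", "to", "are", "is", "too", "use", "many"}
tokens = [word
          for doc in documents
          for word in re.findall(r"[a-z']+", doc.lower())
          if word not in stopwords]

# Inductive starting point: which words appear most often?
print(Counter(tokens).most_common(5))

# Deductive content analysis: frequencies of pre-defined categories
categories = {"calm_language": ["calm", "calming", "relax"],
              "stress_language": ["stressful", "anxious"]}
for category, words in categories.items():
    print(category, sum(tokens.count(word) for word in words))
```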


Thematic analysis

Thematic analysis is a method used to identify, analyze, and report patterns or themes within the data. This approach moves beyond counting explicit words or phrases and also focuses on identifying implicit concepts and themes within the data.


Researchers conduct detailed coding of the data to ascertain repeated themes or patterns of meaning. Codes can be categorized into themes, and the researcher can analyze how the themes relate to one another. Thematic analysis is flexible in terms of the research framework, allowing for both inductive (data-driven) and deductive (theory-driven) approaches. The outcome is a rich, detailed, and complex account of the data.

Grounded theory

Grounded theory is a systematic qualitative research methodology that is used to inductively generate theory that is 'grounded' in the data itself. Analysis takes place simultaneously with data collection, and researchers iterate between data collection and analysis until a comprehensive theory is developed.

Grounded theory is characterized by simultaneous data collection and analysis, the development of theoretical codes from the data, purposeful sampling of participants, and the constant comparison of data with emerging categories and concepts. The ultimate goal is to create a theoretical explanation that fits the data and answers the research question .

Discourse analysis

Discourse analysis is a qualitative research approach that emphasizes the role of language in social contexts. It involves examining communication and language use beyond the level of the sentence, considering larger units of language such as texts or conversations.


Discourse analysts typically investigate how social meanings and understandings are constructed in different contexts, emphasizing the connection between language and power. It can be applied to texts of all kinds, including interviews , documents, case studies , and social media posts.

Phenomenological research

Phenomenological research focuses on exploring how human beings make sense of an experience and delves into the essence of this experience. It strives to understand people's perceptions, perspectives, and understandings of a particular situation or phenomenon.


It involves in-depth engagement with participants, often through interviews or conversations, to explore their lived experiences. The goal is to derive detailed descriptions of the essence of the experience and to interpret what insights or implications this may bear on our understanding of this phenomenon.


Now that we've summarized the major approaches to data analysis, let's look at the broader process of research and data analysis. Suppose you need to do some research to find answers to any kind of research question, be it an academic inquiry, business problem, or policy decision. In that case, you need to collect some data. There are many methods of collecting data: you can collect primary data yourself by conducting interviews, focus groups , or a survey , for instance. Another option is to use secondary data sources. These are data previously collected for other projects, historical records, reports, statistics – basically everything that exists already and can be relevant to your research.


The data you collect should always be a good fit for your research question . For example, if you are interested in how many people in your target population like your brand compared to others, it is no use to conduct interviews or a few focus groups . The sample will be too small to get a representative picture of the population. If your questions are about "how many….", "what is the spread…" etc., you need to conduct quantitative research . If you are interested in why people like different brands, their motives, and their experiences, then conducting qualitative research can provide you with the answers you are looking for.

Let's describe the important steps involved in conducting research.

Step 1: Planning the research

As the saying goes: "Garbage in, garbage out." Suppose you find out after you have collected data that

  • you talked to the wrong people
  • asked the wrong questions
  • a couple of focus groups sessions would have yielded better results because of the group interaction, or
  • a survey including a few open-ended questions sent to a larger group of people would have been sufficient and required less effort.

Think thoroughly about sampling, the questions you will be asking, and in which form. If you conduct a focus group or an interview, you are the research instrument, and your data collection will only be as good as you are. If you have never done it before, seek some training and practice. If you have other people do it, make sure they have the skills.


Step 2: Preparing the data

When you conduct focus groups or interviews, think about how to transcribe them. Do you want to run them online or offline? If online, check out which tools can serve your needs, both in terms of functionality and cost. For any audio or video recordings , you can consider using automatic transcription software or services. Automatically generated transcripts can save you time and money, but they still need to be checked. If you don't do this yourself, make sure that you instruct the person doing it on how to prepare the data.

  • How should the final transcript be formatted for later analysis?
  • Which names and locations should be anonymized?
  • What kind of speaker IDs to use?
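As a small, purely illustrative sketch of this preparation work (independent of ATLAS.ti or any other tool), the snippet below pseudonymizes names and standardizes speaker labels in a transcript. The transcript text, the pseudonym mapping, and the label convention are all invented for the example.

```python
import re

raw_transcript = """Interviewer: Can you describe your first week at Acme Corp?
Maria Lopez: Honestly, my manager David barely had time to onboard me."""

# Hypothetical pseudonym mapping agreed on before analysis
pseudonyms = {"Maria Lopez": "P01", "David": "P02", "Acme Corp": "[Company]"}

prepared = raw_transcript
for real_name, alias in pseudonyms.items():
    prepared = re.sub(re.escape(real_name), alias, prepared)

# Standardize the interviewer's speaker label
prepared = prepared.replace("Interviewer:", "INT:")
print(prepared)
```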

What about survey data ? Some survey data programs will immediately provide basic descriptive-level analysis of the responses. ATLAS.ti will support you with the analysis of the open-ended questions. For this, you need to export your data as an Excel file. ATLAS.ti's survey import wizard will guide you through the process.

Other kinds of data such as images, videos, audio recordings, text, and more can be imported to ATLAS.ti. You can organize all your data into groups and write comments on each source of data to maintain a systematic organization and documentation of your data.


Step 3: Exploratory data analysis

You can run a few simple exploratory analyses to get to know your data. For instance, you can create a word list or word cloud of all your text data or compare and contrast the words in different documents. You can also let ATLAS.ti find relevant concepts for you. There are many tools available that can automatically code your text data, so you can also use these codings to explore your data and refine your coding.


For instance, you can get a feeling for the sentiments expressed in the data. Who is more optimistic, pessimistic, or neutral in their responses? ATLAS.ti can auto-code the positive, negative, and neutral sentiments in your data. Naturally, you can also simply browse through your data and highlight relevant segments that catch your attention or attach codes to begin condensing the data.
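To illustrate the general idea of sentiment auto-coding, a minimal lexicon-based tagger might look like the toy sketch below. This is only an illustration of the concept, not how ATLAS.ti implements sentiment analysis, and the word lists and example segments are invented.

```python
POSITIVE = {"great", "love", "helpful", "optimistic"}
NEGATIVE = {"bad", "frustrating", "slow", "pessimistic"}

def sentiment_code(text: str) -> str:
    """Return a coarse sentiment code based on a tiny hand-rolled lexicon."""
    words = set(text.lower().replace(".", "").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

segments = [
    "The onboarding was great and really helpful.",
    "Support is slow and frustrating.",
    "We rolled out the change in March.",
]
for segment in segments:
    print(sentiment_code(segment), "-", segment)
```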


Step 4: Build a code system

Whether you start with auto-coding or manual coding, after having generated some first codes, you need to get some order in your code system to develop a cohesive understanding. You can build your code system by sorting codes into groups and creating categories and subcodes. As this process requires reading and re-reading your data, you will become very familiar with your data. Counting on a tool like ATLAS.ti qualitative data analysis software will support you in the process and make it easier to review your data, modify codings if necessary, change code labels, and write operational definitions to explain what each code means.


Step 5: Query your coded data and write up the analysis

Once you have coded your data, it is time to take the analysis a step further. When using software for qualitative data analysis , it is easy to compare and contrast subsets in your data, such as groups of participants or sets of themes.


For instance, you can query the various opinions of female vs. male respondents. Is there a difference between consumers from rural or urban areas or among different age groups or educational levels? Which codes occur together throughout the data set? Are there relationships between various concepts, and if so, why?
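Outside of any particular QDA package, the same kind of query can be sketched with pandas: cross-tabulating codes against respondent attributes and checking which codes co-occur. The coded segments below are invented for illustration.

```python
import pandas as pd

# One row per coded segment (illustrative data)
coded_segments = pd.DataFrame({
    "respondent": ["R1", "R1", "R2", "R3", "R3", "R4"],
    "gender":     ["female", "female", "male", "female", "female", "male"],
    "area":       ["urban", "urban", "rural", "rural", "rural", "urban"],
    "code":       ["price", "trust", "price", "convenience", "trust", "convenience"],
})

# Which codes occur more often for which groups?
print(pd.crosstab(coded_segments["code"], coded_segments["gender"]))
print(pd.crosstab(coded_segments["code"], coded_segments["area"]))

# Which codes co-occur within the same respondent?
paired = coded_segments.merge(coded_segments, on="respondent")
print(pd.crosstab(paired["code_x"], paired["code_y"]))
```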

Step 6: Data visualization

Data visualization brings your data to life. It is a powerful way of seeing patterns and relationships in your data. For instance, diagrams allow you to see how your codes are distributed across documents or specific subpopulations in your data.


Exploring coded data on a canvas, moving around code labels in a virtual space, linking codes and other elements of your data set, and thinking about how they are related and why – all of these will advance your analysis and spur further insights. Visuals are also great for communicating results to others.

Step 7: Data presentation

The final step is to summarize the analysis in a written report . You can now put together the memos you have written about the various topics, select some salient quotes that illustrate your writing, and add visuals such as tables and diagrams. If you follow the steps above, you will already have all the building blocks, and you just have to put them together in a report or presentation.

When preparing a report or a presentation, keep your audience in mind. Does your audience better understand numbers than long sections of detailed interpretations? If so, add more tables, charts, and short supportive data quotes to your report or presentation. If your audience loves a good interpretation, add your full-length memos and walk your audience through your conceptual networks and illustrative data quotes.



The Primary Methods of Qualitative Data Analysis

In academic research as well as in the business landscape, qualitative data analysis plays a crucial role in understanding and interpreting non-numerical data.

Qualitative data analysis helps us make sense of the stories and personal narratives. In the business context, qualitative data analysis turns customer feedback into an in-depth understanding of what matters to customers. Sharing the insights from this analysis with decision-makers helps them drive initiatives that improve customer experiences.

While quantitative data analysis focuses on numerical measurement and statistical analysis, qualitative data analysis delves into the rich and complex nature of human experiences and perceptions.  When analyzed effectively, customer feedback can be transformed into actionable insights for every team across the company.

This guide will provide an in-depth exploration of the different methods employed in qualitative data analysis, as well as the steps involved and challenges encountered. We’ll also have a look at what QDA means in the business context and how to turn it into a high-powered tool for CX and product teams.

Understanding Qualitative Data Analysis Methods

Definition and Importance of Qualitative Data Analysis

Qualitative data analysis refers to the systematic process of examining and interpreting non-numerical data to gain meaningful insights and generate new knowledge. It’s what happens when you put a year’s worth of Amazon reviews into a thematic analysis engine, and end up with a thorough understanding of how users interact with your product (and half a dozen actionable insights to boot).

It involves dissecting text, images, videos, and other forms of qualitative data to identify patterns, themes, and relationships.

By capturing the nuances and depth of human experiences, the qualitative data analysis approach allows researchers to explore complex social phenomena that quantitative approaches cannot fully capture. It provides a rich and detailed understanding of social contexts, individual perspectives, and subjective experiences.

Qualitative data analysis methods offer an in-depth exploration of the hows and whys behind social phenomena, enabling researchers to gain a comprehensive understanding of complex social issues.  It is incredibly valuable in fields such as sociology, anthropology, psychology, and education, where human behavior and social interactions are studied.

In these fields, researchers often seek to understand the intricacies of human experiences, and qualitative data analysis allows them to capture the complexity of these phenomena.

In the world of business & product development, qualitative data analysis methods can work to improve user experiences. Suddenly, you’ve got the opportunity to reach a comprehensive understanding of just what your products mean on the social landscape.

User feedback gets transformed into big-picture knowledge that offers a 360-degree view of how a product performs in the real world.  Product teams get a solid, reliable basis on which to make decisions , and guesswork becomes a thing of the past.

Key Principles of Qualitative Data Analysis

Before delving into the various methods of qualitative data analysis, let’s look at the key principles that underpin these analysis techniques. Qualitative data analysis is guided by the following principles:

  • Inductive Reasoning: Qualitative research focuses on specific observations and gradually develops broader interpretations and theories. It allows for the discovery of new patterns and relationships through an iterative process of data investigation.
  • Contextual Understanding: Qualitative data analysis emphasizes the importance of understanding the research context and the social, cultural, and historical factors that shape it. Context provides meaning and helps researchers identify themes as well as interpret and make sense of the data.
  • Subjectivity and Reflexivity: When research is human-led, the researchers acknowledge and critically reflect upon their own beliefs, biases, and experiences throughout the qualitative data analysis process. Where research is AI-driven, humans get a chance to view the actual data each insight is based on and check to see if it makes objective sense.
  • Active Engagement: A qualitative data analysis method is an active and dynamic process that involves constant engagement with the data. Thematic analysis works most effectively as an ongoing process,  thoroughly examining and interpreting all available data, while continually questioning and refining the research questions and analysis as new data points are added.

Inductive reasoning is a fundamental principle of qualitative data analysis. It allows researchers to start with specific observations and gradually develop broader interpretations and theories. Through this iterative process of data investigation, new patterns and relationships can be discovered. When you’ve got AI-driven data analysis software, this inductive reasoning is going on under the hood.

Contextual understanding is another key principle of the qualitative analysis process. It emphasizes the importance of understanding the research context and the social, cultural, and historical factors that shape it.

By considering the context when analyzing qualitative data, researchers can gain a deeper understanding of the data and interpret it more accurately. Well-designed thematic analysis software has this built in.

Subjectivity and reflexivity are essential principles in qualitative data analysis. Qualitative data analysis research must be repeatable if it is to be relied on, and there should always be ways to check just what qualitative feedback particular trends and insights come from. When qualitative data analysis is done right, transparency and rigor can be maintained throughout the process, from the initial selection of research questions and gathering of raw data to final analysis techniques.

Active engagement is a crucial aspect of qualitative feedback interpretation. It involves constant engagement with the data, as researchers thoroughly examine and interpret it. This active and dynamic process allows researchers to continually question and refine their qualitative analysis, ensuring a comprehensive understanding of the data.

Different Qualitative Data Analysis Methods

Just how does qualitative analysis work out in practice? In this article, we will explore five commonly used qualitative analysis methods: content analysis, narrative analysis, discourse analysis,  grounded theory, and thematic analysis.

[Figure: Flowchart of the steps involved in content analysis]

  • Content Analysis

Content analysis is a systematic and objective approach to analyzing data by categorizing, coding, and quantifying specific words, themes, or concepts within a text. It involves identifying patterns, frequencies, and relationships in the content, which can be textual, visual, or auditory.

Researchers can employ content analysis techniques to examine interviews, focus group discussions, newspaper articles, social media posts, and other forms of textual data. By assigning codes to different segments of the text, researchers can identify recurring themes, sentiments, or messages.

This same qualitative data analysis approach can be used by CX and product teams to analyze customer feedback or support tickets.

For example, in an analysis of public response to a new product, a PX team might use content analysis to analyze social media posts discussing the topic.

By categorizing the posts based on their stance (e.g., positive, negative, neutral) and identifying recurring themes (e.g., user experience, look and feel), a company could gain insights into the dominant narratives and public perceptions surrounding the product launch.


  • Narrative Analysis

Narrative analysis focuses on interpreting and understanding the stories and personal narratives shared by individuals. Researchers analyze the structure, content, and meaning of these narratives to gain insights into how individuals make sense of their experiences, construct identities, and communicate their perspectives.

Through narrative analysis techniques, qualitative researchers explore the plot, characters, setting, and themes within a narrative. They examine how the narrator constructs meaning, conveys emotions, and positions themselves within the story.

This same narrative analysis method is often used in psychology, sociology, and anthropology to understand identity formation, life histories, and personal narratives. It can be used in a business setting to analyze long-form responses and user interviews or descriptions of user behavior.

For instance, in a study on the experiences of cancer survivors, researchers may conduct narrative analysis on interviews with survivors. By examining the narratives, researchers can identify common themes such as coping strategies, support systems, and personal growth.

This qualitative analysis process can provide valuable insights into the lived experiences of cancer survivors and inform interventions and support programs.

[Image: The new X logo over the old Twitter logo, with feedback from users]

  • Discourse Analysis

Discourse analysis examines the social, cultural, and power relations that shape language use in different contexts. It focuses on the ways in which language constructs and reflects social reality, identities, and ideologies.

Researchers employing discourse analysis analyze data that includes spoken or written language, including interviews, speeches, media articles, and conversations.

They examine linguistic features such as metaphors, power dynamics, framing, and silences to uncover underlying social structures and processes.

For example, in a study on gender representation in media, researchers may use discourse analysis to analyze television advertisements. By examining the language, visual cues, and narratives used in the advertisements, researchers can identify how gender roles and stereotypes are constructed and reinforced.

It can shed light on the ways in which media perpetuates or challenges societal norms and expectations.

Another example might be using discourse analysis to analyze TikTok and YouTube videos to understand societal responses to a rebranding, such as the shift from Twitter to X. Customer interviews are another good source for this analysis method.

  • Grounded Theory

Grounded theory is an approach to qualitative analysis that aims to develop theories and concepts grounded in data. It involves iterative data collection and analysis to develop an inductive theory that emerges from the unstructured data itself.

Researchers using grounded theory analyze interviews, observations, and textual data to generate concepts and categories.

These concepts are continually refined and developed through theoretical sampling and constant comparison. Grounded theory analysis is particularly useful when exploring complex social phenomena where existing theories may be limited.

For instance, in a study on the experiences of individuals living with chronic pain, researchers may use grounded theory to analyze interviews with participants. Through iterative analysis, researchers can identify key concepts such as pain management strategies, social support networks, and psychological coping mechanisms.

These concepts can then be used to develop a theoretical framework within grounded theory that captures the multidimensional nature of living with chronic pain.

Although historically grounded theory analysis has been primarily used in the social sciences, grounded theory has also been used successfully for business inquiry.

  • Thematic Analysis

Thematic analysis is a widely used method in qualitative data analysis that involves identifying, analyzing, and reporting patterns or themes within data. It is a flexible approach that can be applied across a variety of qualitative data, such as interview transcripts, survey responses, and observational notes.

When thematic analysis is done manually, researchers initially familiarize themselves with the raw data, reading through the material multiple times to gain a deep understanding.

Following this, they begin manual coding. The first step is to generate initial codes, which are tags or labels that identify important features of the data relevant to the research question.

These codes are then collated into potential themes, which are broader patterns that emerge across the data set.

Each theme is then reviewed and refined to ensure it accurately represents the coded data and the overall data set. The final step involves defining and naming the themes, during which researchers provide detailed analysis, including how themes relate to each other and to the research question.

Sound complicated? The great news is that advances in artificial intelligence mean we no longer have to do all that by hand.

Thematic analysis software can process thousands of pieces of consumer feedback in a matter of minutes, providing a user-friendly view of the themes and trends in the customer data pool.

What’s more, this type of software can be programmed to do content analysis, discourse analysis, and narrative analysis at the same time. The best comprehensive business solution for thematic analysis today is Thematic, a feedback analysis platform designed for customer-centric businesses. It makes qualitative user analysis accessible to anyone and can process feedback at scale.

Across disciplines, thematic analysis is particularly valued for its ability to provide a rich and detailed, yet complex account of data. It's a method that is accessible to researchers across different levels of qualitative research experience and can be applied to a variety of theoretical and epistemological approaches, making it a versatile tool in qualitative work.
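As a rough sketch of how this kind of automation can work in principle (this is not Thematic's actual method, just a generic scikit-learn illustration with invented feedback), short feedback snippets can be vectorized and clustered, and the top terms of each cluster inspected as candidate themes.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

feedback = [
    "Checkout keeps failing on my phone",
    "The payment page crashes every time",
    "Love the new dark mode theme",
    "Dark mode is much easier on the eyes at night",
    "Delivery took two weeks, far too slow",
    "Shipping delays ruined my gift order",
]

# Represent each snippet as a TF-IDF vector, then cluster into candidate themes
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(feedback)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

terms = vectorizer.get_feature_names_out()
for cluster in range(kmeans.n_clusters):
    top_term_ids = kmeans.cluster_centers_[cluster].argsort()[::-1][:3]
    print(f"Candidate theme {cluster}:", [terms[i] for i in top_term_ids])
```

A human reviewer would still name the themes and check that the clusters make interpretive sense; the clustering only proposes groupings.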

[Image: Thematic product view with data sources piped in automatically, showing volume and a qualitative summary]

Steps in Qualitative Data Analysis

Data Collection

Data collection is the initial phase of qualitative research and data analysis. It involves selecting appropriate methods to gather data such as interviews, observations, focus groups, or archival research.

Researchers may employ various techniques to collect data. These can include developing interview protocols, conducting observations, or collecting data using audio-visual recording devices.

They may need to consider ethical considerations, ensure informed consent, and establish rapport with participants to obtain rich and reliable data. The goal is to gather qualitative data that is relevant, comprehensive, and representative of the research topic.

Qualitative research questions can be more open-ended than those used for gathering quantitative data, and the research findings have the potential to be far more extensive.

In a business context, much of the work is done for you by customers who provide feedback in reviews, on support tickets, and on social media. Customer interviews are another possible source of rich data.

Data Coding

Data coding is the process of categorizing and organizing qualitative data into meaningful segments. When this is done manually, researchers assign codes to different parts of the data based on the emerging patterns, themes, or concepts identified during analysis. This coding process helps researchers manage and make sense of large amounts of qualitative data.

There are different types of codes used in analyzing raw data, including descriptive codes, interpretive codes, and conceptual codes.

Descriptive codes capture the content and surface-level meaning of all the data, while interpretive codes delve deeper into the underlying meanings and interpretations. A conceptual coding system further abstracts the research data by identifying broader concepts or theories.
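As a purely hypothetical illustration of these three levels, the same data segment might carry codes like the following (the segment text and all labels are invented):

```python
segment = {
    "text": "I double-check every invoice myself because the last system lost two of them.",
    "codes": {
        # Descriptive: what is literally said
        "descriptive": ["manual invoice checking"],
        # Interpretive: the underlying meaning the researcher reads into it
        "interpretive": ["distrust of the new system"],
        # Conceptual: broader concepts or theoretical categories
        "conceptual": ["organizational trust", "risk management"],
    },
}

print(segment["codes"]["conceptual"])
```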

Data Interpretation

Data interpretation involves making sense of the coded data and exploring the relationships, themes, and patterns that emerge from the analysis. Researchers critically examine the data, compare different codes, and then identify themes and connections between categories and concepts.

During data interpretation, researchers may engage in constant comparison, where they continually compare new data to existing codes and categories. This iterative process helps refine the analysis and identify theoretical insights.

It involves synthesizing the findings of qualitative and quantitative data and crafting a narrative that presents a comprehensive understanding of the research phenomenon.

Both data coding and data interpretation can be done by your qualitative analytics software, whether in a research or business setting. In a corporate setting, CX/PX teams and customer service can then use information gained through the data interpretation step to drive favorable outcomes.

Performing Qualitative Data Analysis with Generative AI and LLM

Running manual grounded theory analysis or content analysis on a large amount of consumer feedback has never been a practical option. But that doesn’t mean qualitative research doesn’t make sense in a business context.

Generative AI, based on large language models (LLMs), can work with qualitative data at scale, analyze it, and derive the themes, connections, and insights that can inform business decisions.

An LLM is a powerful machine learning model based on deep learning and neural networks. It is able to process and identify complex relationships in natural language, and it can also understand user questions and moods and even generate text.

A natural language processing LLM, trained on huge amounts of text data, could do all the work of a QDA researcher with the added benefit of easily verifiable, repeatable results.
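A minimal sketch of the idea is shown below, assuming the openai Python package, an OPENAI_API_KEY environment variable, and access to a chat model; the feedback, prompt, and model name are illustrative assumptions, and production feedback-analysis tools layer validation, deduplication, and traceability on top of this.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

feedback = [
    "The export button is buried three menus deep.",
    "Exports fail for files over 50 MB.",
    "Support resolved my billing issue in minutes, great service.",
]

prompt = (
    "You are a qualitative researcher. Identify the main themes in this customer "
    "feedback and quote one supporting snippet per theme:\n\n" + "\n".join(feedback)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whichever model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```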

Companies with extensive in-house talent may be able to build an in-house AI engine to analyze customer feedback and make sense of it, at least on a small scale. Those who are serious about getting real insights, though, will want to go with professional tools that have been trained on massive amounts of data and give reliable, dependable results.

Thematic is probably the best example of such a tool. Built to make sense of any amount of feedback data, it works in a highly transparent way that will leave you confident in every insight you derive.

It’s also incredibly user-friendly, with helpful visualizations and an easy-to-use dashboard that enables you to keep constant tabs on exactly what your users feel about the company. It’s never been easier to transform your user experience.

Modern Methods of Qualitative Data Analysis in Action: A Case Study

[Image: Three Instacart shopping bags ascending in size, suggesting a chart of growth with Thematic]

Instacart is one example of a company that discovered the power of qualitative data analysis. The company has 10 million end users, 500,000 personal shoppers, and more than 40,000 retailers. Processing all this qualitative data the traditional way would have been impossible, but Ant Marty, product operations team manager, found a method that worked.

Plugging data from the app into Thematic, she got real-time information on everything happening among those millions of users: trends, themes, and a deep understanding of what mattered to the people who made the company run.

Data collection is easy when you have an app with numerous feedback collection options. Data coding is automated by Thematic. And Thematic makes the first move in interpretation as well, providing insights that product teams can transform into action plans and even a long-term vision.

Challenges Facing Qualitative Data Analysis Methods

Ensuring Data Validity and Reliability

One of the main challenges to a qualitative approach is ensuring the validity and reliability of the findings. Validity refers to the accuracy, truthfulness, and credibility of the data collected and analysis, while reliability refers to the consistency and replicability of the research process and findings.

Researchers address these challenges by employing rigorous data collection methods, ensuring data saturation, conducting member checks, and establishing inter-rater reliability. They also maintain reflexivity by critically reflecting on their assumptions, biases, and interpretations throughout the analysis process.
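One common way to quantify inter-rater reliability is Cohen's kappa, which measures agreement between two coders beyond what chance alone would produce. A small sketch using scikit-learn, with invented codings, is shown below.

```python
from sklearn.metrics import cohen_kappa_score

# Codes assigned to the same ten segments by two independent coders (illustrative data)
coder_a = ["price", "trust", "price", "ux", "ux", "trust", "price", "ux", "trust", "price"]
coder_b = ["price", "trust", "ux",    "ux", "ux", "trust", "price", "ux", "price", "price"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values around 0.6-0.8 are often read as substantial agreement
```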

If you are a business using software to conduct qualitative research, your data validation check may be somewhat different, but it’s just as important.  Some software, like Thematic, has validation built in, and the whole process is so transparent you can easily check and double-check where each insight comes from .

With other software options, you may have to run manual checks to ensure every piece of information provided has a firm basis.

Dealing with Subjectivity and Bias

Subjectivity and bias used to be considered inherent to qualitative research methods due to the interpretive nature of the process. Researchers bring their own perspectives, beliefs, and experiences, which can influence the analysis and interpretations.

To mitigate subjectivity and bias, researchers maintain transparency in their analytical processes by documenting their decision-making, providing detailed justifications for their interpretations, and engaging in peer debriefing and member checking. Using multiple researchers or an expert panel can also increase the credibility and reliability of the analysis.

Another way to decrease subjectivity is through thematic analysis software, which produces results that are repeatable and verifiable.

When it is all said and done, qualitative analysis offers a powerful and nuanced examination of human experiences and social phenomena. By employing diverse methods, adhering to key principles, and addressing potential limitations, researchers can harness the full potential of qualitative data to uncover rich insights and contribute to the advancement of knowledge.

Benefits of Qualitative Data Analysis Methods

Rich, In-Depth Insights

A primary benefit of qualitative research techniques is their ability to provide rich, in-depth insights into complex phenomena. These methods delve deeply into human experiences, emotions, beliefs, and behaviors, offering a comprehensive understanding that is often unattainable through quantitative methods.

By exploring the nuances and subtleties of social interactions and personal experiences, qualitative analysis can uncover the layers of meaning that underpin human behavior. This depth of understanding is particularly valuable in fields like psychology, sociology, and anthropology, where the intricacies of human experience are central to the research question.

It is even more important for customer-focused businesses, enabling them to create a product and a CX that meet their customers' needs and desires. Quantitative analysis can provide a one-dimensional understanding of user behavior based on quantitative data, but when analyzing qualitative data you get the why behind every what.

Flexibility and Contextual Understanding

Another significant advantage of these analysis techniques is their inherent flexibility and capacity to provide contextual understanding. Unlike quantitative research, which relies on rigid structures and predefined hypotheses, qualitative research is adaptable to the evolving nature of the study.

This flexibility allows researchers to explore unexpected themes and patterns that emerge during the data collection process.  Qualitative analysis is how businesses like Atlassian have created infinite customer feedback loops and powered their own infinitely evolving products.

Additionally, qualitative methods are sensitive to the context in which the data is collected, acknowledging and incorporating the environmental, cultural, and social factors that influence the data. The context-rich approach used to collect qualitative data ensures a more holistic understanding of the subject matter, making it particularly useful in cross-cultural studies, community research, and exploratory investigations.

Your product may have global reach, and users in different areas may interact with it in different ways, but qualitative techniques can take all that into account.

This considered, it should be no surprise that qualitative analysis techniques have become powerful tools for researchers seeking to understand the complexities of human behavior and social phenomena. Their ability to provide depth, context, and rich narrative data makes them indispensable tools in the arsenal of social science research, and there’s no better way to gain solid information to guide your business decisions.

Whether you’re a researcher keen on analyzing and interpreting qualitative data or an entrepreneur keen on making your business more customer-centric, this research method is likely to become your next best friend.

If you’re in academia, you may want to do it all manually, and that’s totally okay. But if it’s business intelligence you’re after— try out Thematic. Your future self will thank you, as will everyone else who views the end-of-year reports.

What are the five methods to analyze qualitative data?

The five chief methods of qualitative data analysis are:

  • Content analysis
  • Narrative analysis
  • Discourse analysis
  • Grounded theory
  • Thematic analysis

The right analysis method for your use case will depend on your context, your research questions, and the form of data available to you.

What are good sources of data for qualitative data analysis?

In a business context user reviews, support tickets, customer surveys and social media posts are all great sources of data for qualitative analysis. In a research project, gathering qualitative data may mean conducting interviews, surveys, or focus groups.

What are the benefits of qualitative data analysis?

Two big benefits of qualitative data analysis include:

  • Rich, in-depth insights
  • Flexibility and contextual understanding

In a business context, this translates into a loyal, well-satisfied user base, a successful product, and an upwards-ticking revenue curve. Research objectives for social sciences may include a better understanding of social dynamics or human relations.

What are the challenges of qualitative data analysis?

The two prime challenges of qualitative data analysis techniques are:

  • Ensuring data validity and reliability
  • Dealing with subjectivity and bias

What is the best tool for qualitative data analysis?

While a number of other options do exist, the best comprehensive software for qualitative data analysis in a business context today is Thematic.



  • Usability testing

Run remote usability tests on any digital product to deep dive into your key user flows

  • Product analytics

Learn how users are behaving on your website in real time and uncover points of frustration

  • Research repository

A tool for collaborative analysis of qualitative data and for building your research repository and database.

Trymata Blog

How-to articles, expert tips, and the latest news in user testing & user experience

Knowledge Hub

Detailed explainers of Trymata’s features & plans, and UX research terms & topics

  • Plans & Pricing

Get paid to test

  • For UX & design teams
  • For product teams
  • For marketing teams
  • For ecommerce teams
  • For agencies
  • For startups & VCs
  • Customer Stories

How do you want to use Trymata?

Conduct user testing, desktop usability video.



What is Qualitative Data Analysis? Definition, Types, Methods, Examples and Best Practices 


What is Qualitative Data Analysis?

Qualitative data analysis is defined as a systematic process used to interpret and make sense of non-numerical data, focusing on the exploration of meanings, patterns, and themes. Unlike quantitative data, which deals with measurable quantities, qualitative data involves subjective information such as text, images, or audio. Qualitative data analysis is commonly employed in social sciences, humanities, and other fields where understanding the context and nuances of data is crucial.

The first step in qualitative data analysis involves data preparation, where researchers organize and structure their raw data. This may include transcribing interviews, categorizing information, or coding textual data. Once the data is organized, researchers move on to exploration, seeking patterns, connections, and recurring themes within the information. This phase often involves techniques like content analysis or grounded theory, allowing for a deeper understanding of the underlying concepts.

The third phase, data interpretation, involves deriving meaning from the identified patterns. Researchers critically analyze the data to develop insights, draw conclusions, and construct a narrative that explains the findings. Interpretation in qualitative analysis is subjective and context-dependent, emphasizing the importance of the researcher’s perspective and reflexivity.

Finally, researchers communicate their findings through comprehensive reports or presentations, providing a rich and contextualized understanding of the studied phenomenon. Qualitative data analysis contributes valuable insights to research by uncovering nuanced perspectives and offering a deeper understanding of complex social phenomena, enhancing the overall knowledge in diverse academic disciplines.

Key Components of Qualitative Data Analysis

Qualitative data analysis involves several key components that contribute to a comprehensive understanding of non-numerical data:

  • Data Preparation:

Organization: Structuring and organizing raw data, including tasks like transcription and categorization.

Cleaning: Ensuring data quality by addressing inconsistencies or errors in the information.

  • Data Exploration:

Pattern Recognition: Identifying recurring patterns, themes, or trends within the qualitative data.

Coding: Categorizing and labeling specific segments of data to facilitate analysis.

  • Data Interpretation:

Meaning Making: Deriving meaning from identified patterns and themes.

Contextualization: Placing findings in the broader context to understand the significance of the data.

  • Reflexivity:

Researcher’s Role: Acknowledging and reflecting on the researcher’s influence on the analysis, as personal perspectives can shape interpretations.

  • Validity and Reliability:

Credibility: Ensuring the trustworthiness of findings through methods such as member checks or triangulation.

Consistency: Striving for reliability in the analysis process to enhance the dependability of results.

  • Report Writing:

Narrative Construction: Creating a coherent narrative that communicates the insights gained from the analysis.

Ethical Considerations: Addressing ethical concerns related to participant confidentiality and informed consent in the reporting phase.

These components collectively contribute to a rigorous and systematic qualitative data analysis, allowing researchers to uncover rich insights from non-numerical information.

Qualitative Data Analysis: Key Process Steps

Qualitative data analysis involves several key process steps to systematically make sense of non-numerical data:

  • Data Familiarization:

Immersion: Immerse yourself in the data by reading and re-reading it thoroughly to develop a comprehensive understanding.

Initial Coding: Assign preliminary codes to meaningful segments of data, capturing the essence of the content.

  • Data Organization:

Segmentation: Divide the data into manageable units, often through techniques like paragraphing or transcribing interviews.

Thematic Coding: Identify and label recurring themes or patterns that emerge, creating an initial framework for analysis.

  • Data Reduction:

Condensation: Summarize the data by extracting key information, helping to focus on the most relevant aspects.

Selective Coding: Narrow down the focus to specific themes or categories, discarding less pertinent information.

  • Data Display:

Visualization: Represent the data visually through methods like mind maps, charts, or diagrams to facilitate a clearer understanding.

Pattern Identification: Explore connections and relationships between different data elements to reveal overarching patterns.

Contextualization: Understand the context in which the data was generated, considering the broader environment and circumstances.

Constant Comparison: Continuously compare and contrast data points, refining interpretations and ensuring consistency in coding.

  • Conclusion Drawing:

Pattern Consistency: Verify that identified patterns are consistent across the dataset, enhancing the reliability of the conclusions.

Theoretical Saturation: Determine when enough data has been analyzed to reach theoretical saturation, meaning that new insights or themes are no longer emerging.

These detailed steps collectively guide researchers through the intricate process of qualitative data analysis, ensuring a rigorous and systematic exploration of non-numerical information.
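
To make the coding and theming steps above more tangible, here is a minimal Python sketch of how coded segments might be tallied into themes. The codes and excerpts are invented for illustration; a real project would use its own codebook and data.

```python
# Minimal sketch of the coding and data reduction steps described above.
# The code names (e.g. "pricing", "support") and the excerpts are hypothetical.
from collections import Counter

# Segmented data with preliminary codes assigned by the researcher
coded_segments = [
    {"text": "The monthly fee feels too high for what we get.", "codes": ["pricing"]},
    {"text": "Support replied within an hour and solved it.", "codes": ["support", "positive_experience"]},
    {"text": "I couldn't find the export button anywhere.", "codes": ["navigation"]},
    {"text": "Great help from the chat team, but pricey.", "codes": ["support", "pricing"]},
]

# Data reduction/display: tally how often each code occurs across segments
code_counts = Counter(code for seg in coded_segments for code in seg["codes"])

for code, count in code_counts.most_common():
    print(f"{code}: {count} segment(s)")
```

The resulting counts are only a starting point for the display and conclusion-drawing steps; the researcher still has to interpret what the recurring codes mean in context.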

Types of Qualitative Data Analysis Methods with Examples

Several qualitative data analysis methods exist, each tailored to uncover specific insights from non-numerical data. Here are some prominent types:

1. Grounded Theory:

Objective: Develop a theory grounded in the data, allowing themes and concepts to emerge without preconceived notions.

Process: Iterative cycles of data collection, coding, and theory development.

Example: Researchers conducting interviews with cancer survivors might identify themes such as “coping mechanisms” and “support networks” emerging organically, leading to the development of a grounded theory on resilience in cancer survivors.

2. Content Analysis:

Objective: Systematically analyze textual or visual content to identify patterns, themes, or trends.

Process: Coding and categorizing content based on predefined criteria or emerging themes.

Example: Analyzing news articles about climate change to identify recurring themes, such as “policy responses,” “public perception,” and “scientific consensus,” providing insights into media discourse on the topic.

3. Narrative Analysis:

Objective: Explore the stories and narratives within qualitative data to understand how individuals construct meaning.

Process: Analyzing the structure, content, and context of narratives to derive insights.

Example: Studying personal narratives of individuals who have undergone major life transitions, like migration or career changes, to understand how they construct and make sense of their experiences.

4. Phenomenological Analysis:

Objective: Explore and understand lived experiences to uncover the essence of a phenomenon.

Process: Identifying and analyzing individual experiences through in-depth interviews or observations.

Example: Exploring the lived experiences of individuals with chronic pain through in-depth interviews to uncover the essence of their experiences and how it shapes their daily lives.

5. Ethnographic Analysis:

Objective: Examine and interpret cultural practices within a specific social context.

Process: Immersive fieldwork, participant observation, and detailed description of cultural phenomena.

Example: Conducting ethnographic fieldwork in a community to understand the cultural practices, social dynamics, and everyday life of its members, providing a holistic view of the community.

6. Case Study Analysis:

Objective: In-depth exploration of a specific case or phenomenon within its real-life context.

Process: Comprehensive examination of the case through various data sources, often involving multiple methods.

Example: Examining a specific organization’s response to a crisis by analyzing internal documents, interviews with employees, and media coverage to understand the unique factors influencing the organization’s actions.

7. Constant Comparative Analysis:

Objective: Continuously compare data as it is collected and coded to refine categories and themes.

Process: Iterative comparison of data points to identify patterns and relationships.

Example: Continuously comparing interview transcripts from different participants in a study on job satisfaction to refine categories and identify commonalities and differences.

8. Framework Analysis:

Objective: Apply a structured framework to systematically organize and interpret data.

Process: Sorting, organizing, and categorizing data according to predefined themes or concepts.

Example: Applying a predefined framework to analyze focus group discussions on public health issues, categorizing responses into themes such as “awareness,” “perceived barriers,” and “suggested solutions.”

Researchers often choose a method based on the nature of their research questions, the type of data collected, and their epistemological and ontological perspectives. The selection of the appropriate method depends on the depth and richness of insights sought from the qualitative data.


Research-Methodology

Qualitative Data Analysis

Qualitative data refers to non-numeric information such as interview transcripts, notes, video and audio recordings, images and text documents. Qualitative data analysis can be divided into the following five categories:

1. Content analysis. This refers to the process of categorizing verbal or behavioural data to classify, summarize and tabulate the data.

2. Narrative analysis. This method involves the reformulation of stories presented by respondents, taking into account the context of each case and the different experiences of each respondent. In other words, narrative analysis is the revision of primary qualitative data by the researcher.

3. Discourse analysis. A method of analysing naturally occurring talk and all types of written text.

4. Framework analysis. This is a more advanced method that consists of several stages such as familiarization, identifying a thematic framework, coding, charting, mapping and interpretation.

5. Grounded theory. This method of qualitative data analysis starts with an analysis of a single case to formulate a theory. Then, additional cases are examined to see if they contribute to the theory.

Qualitative data analysis can be conducted through the following three steps:

Step 1: Developing and Applying Codes. Coding can be explained as the categorization of data. A ‘code’ can be a word or a short phrase that represents a theme or an idea. All codes need to be assigned meaningful titles. A wide range of non-quantifiable elements such as events, behaviours, activities and meanings can be coded.

There are three types of coding:

  • Open coding . The initial organization of raw data to try to make sense of it.
  • Axial coding . Interconnecting and linking the categories of codes.
  • Selective coding . Formulating the story through connecting the categories.

Coding can be done manually or using qualitative data analysis software such as NVivo, ATLAS.ti 6.0, HyperRESEARCH 2.8, MAXQDA and others.

When coding manually you can use folders, filing cabinets, wallets, etc. to gather together materials that are examples of similar themes or analytic ideas. Manual coding is labour-intensive, time-consuming and widely considered outdated.

In computer-based coding, on the other hand, physical files and cabinets are replaced with computer-based directories and files. When choosing software for qualitative data analysis you need to consider a wide range of factors, such as the type and amount of data you need to analyse, the time required to master the software, and cost.

Moreover, it is important to get confirmation from your dissertation supervisor before applying any specific qualitative data analysis software.
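
For researchers who prefer to script this themselves rather than adopt a dedicated package, the folder-per-code idea behind computer-based coding can be reproduced in a few lines. This is a minimal sketch with invented codes and excerpts, not a prescribed workflow.

```python
# Minimal sketch of "computer-based coding": instead of physical folders,
# each code gets its own directory collecting the excerpts assigned to it.
# Directory names, code names, and excerpts here are hypothetical examples.
from pathlib import Path

coded_excerpts = {
    "market_entry_strategies": [
        "Respondent 3 favoured a joint venture with a local partner.",
        "Exporting was seen as the lowest-risk first step.",
    ],
    "leadership_practice": [
        "Participant 7 described leaders as 'made through experience'.",
    ],
}

base = Path("coding")  # one directory per code, like a digital filing cabinet
for code, excerpts in coded_excerpts.items():
    code_dir = base / code
    code_dir.mkdir(parents=True, exist_ok=True)
    (code_dir / "excerpts.txt").write_text("\n\n".join(excerpts), encoding="utf-8")
```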

The following table contains examples of research titles, elements to be coded and identification of relevant codes:

Research title: Born or bred: revising The Great Man theory of leadership in the 21st century
Element to be coded: Leadership practice
Relevant codes: Born leaders; Made leaders; Leadership effectiveness

Research title: A study into advantages and disadvantages of various entry strategies to the Chinese market
Element to be coded: Market entry strategies
Relevant codes: Wholly-owned subsidiaries; Joint ventures; Franchising; Exporting; Licensing

Research title: Impacts of CSR programs and initiatives on brand image: a case study of Coca-Cola Company UK
Element to be coded: Activities, phenomenon
Relevant codes: Philanthropy; Supporting charitable causes; Ethical behaviour; Brand awareness; Brand value

Research title: An investigation into the ways of customer relationship management in the mobile marketing environment
Element to be coded: Tactics
Relevant codes: Viral messages; Customer retention; Popularity of social networking sites

Qualitative data coding

Step 2: Identifying themes, patterns and relationships. Unlike quantitative methods, qualitative data analysis has no universally applicable techniques that can be applied to generate findings. The analytical and critical thinking skills of the researcher play a significant role in data analysis in qualitative studies. Therefore, no qualitative study can be repeated to generate exactly the same results.

Nevertheless, there is a set of techniques that you can use to identify common themes, patterns and relationships within responses of sample group members in relation to codes that have been specified in the previous stage.

Specifically, the most popular and effective methods of qualitative data interpretation include the following:

  • Word and phrase repetitions – scanning primary data for the words and phrases most commonly used by respondents, as well as words and phrases used with unusual emphasis or emotion (a minimal sketch of this technique follows this list);
  • Primary and secondary data comparisons – comparing the findings of interviews/focus groups/observations/any other qualitative data collection method with the findings of the literature review and discussing differences between them;
  • Search for missing information – discussing which aspects of the issue were not mentioned by respondents, although you expected them to be mentioned;
  • Metaphors and analogies – comparing primary research findings to phenomena from a different area and discussing similarities and differences.
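
As a simple illustration of the first technique, the sketch below counts the most frequent words across a set of transcript files. The file names and stop-word list are placeholders; a real analysis would use your own transcripts and a fuller stop-word list.

```python
# Minimal sketch of the "word and phrase repetitions" technique: scan interview
# transcripts for the most frequently used terms. File names are hypothetical.
import re
from collections import Counter
from pathlib import Path

STOP_WORDS = {"the", "and", "a", "to", "of", "it", "is", "i", "in", "that", "was"}

def word_frequencies(transcript_paths):
    counts = Counter()
    for path in transcript_paths:
        text = Path(path).read_text(encoding="utf-8").lower()
        words = re.findall(r"[a-z']+", text)
        counts.update(w for w in words if w not in STOP_WORDS)
    return counts

# Example usage (assuming the transcript files exist on disk):
# freqs = word_frequencies(["interview_01.txt", "interview_02.txt"])
# print(freqs.most_common(20))
```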

Step 3: Summarizing the data. At this last stage you need to link research findings to your hypotheses or research aim and objectives. When writing the data analysis chapter, you can use noteworthy quotations from the transcripts to highlight major themes within the findings and possible contradictions.

It is important to note that the process of qualitative data analysis described above is general and different types of qualitative studies may require slightly different methods of data analysis.



5 qualitative data analysis methods

Qualitative data uncovers valuable insights that help you improve the user and customer experience. But how exactly do you measure and analyze data that isn't quantifiable?

There are different qualitative data analysis methods to help you make sense of qualitative feedback and customer insights, depending on your business goals and the type of data you've collected.

Before you choose a qualitative data analysis method for your team, you need to consider the available techniques and explore their use cases to understand how each process might help you better understand your users. 

This guide covers five qualitative analysis methods to choose from, and will help you pick the right one(s) based on your goals. 

Content analysis

Thematic analysis

Narrative analysis

Grounded theory analysis

Discourse analysis

5 qualitative data analysis methods explained

Qualitative data analysis (QDA) is the process of organizing, analyzing, and interpreting qualitative research data—non-numeric, conceptual information, and user feedback—to capture themes and patterns, answer research questions, and identify actions to improve your product or website.

Step 1 in the research process (after planning) is qualitative data collection. You can use behavior analytics software—like Hotjar—to capture qualitative data with context, and learn the real motivation behind user behavior, by collecting written customer feedback with Surveys or scheduling an in-depth user interview with Engage.

Use Hotjar’s tools to collect feedback, uncover behavior trends, and understand the ‘why’ behind user actions.

1. Content analysis

Content analysis is a qualitative research method that examines and quantifies the presence of certain words, subjects, and concepts in text, image, video, or audio messages. The method transforms qualitative input into quantitative data to help you make reliable conclusions about what customers think of your brand, and how you can improve their experience and opinion.

Conduct content analysis manually (which can be time-consuming) or use analysis tools like Lexalytics to reveal communication patterns, uncover differences in individual or group communication trends, and make broader connections between concepts.
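
To see what a manual pass might look like before investing in a tool, here is a minimal Python sketch that turns open-ended feedback into counts for a few predefined categories. The categories, keyword lists, and responses are invented for illustration and are not part of any particular tool’s workflow.

```python
# Minimal sketch of manual content analysis: count how often predefined
# categories (identified by simple keyword lists) appear in open-ended feedback.
feedback = [
    "Checkout kept crashing on my phone",
    "Support was friendly but slow to respond",
    "Love the new dashboard, very easy to navigate",
    "The app crashes every time I upload a photo",
]

categories = {
    "reliability": ["crash", "bug", "error"],
    "support": ["support", "help", "respond"],
    "usability": ["easy", "navigate", "intuitive"],
}

counts = {name: 0 for name in categories}
for response in feedback:
    text = response.lower()
    for name, keywords in categories.items():
        if any(kw in text for kw in keywords):
            counts[name] += 1

for name, n in counts.items():
    print(f"{name}: {n}/{len(feedback)} responses")
```

Keyword matching this crude will miss synonyms and context, which is exactly the gap that dedicated content analysis tools aim to close.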

Benefits and challenges of using content analysis

How content analysis can help your team

Content analysis is often used by marketers and customer service specialists, helping them understand customer behavior and measure brand reputation.

For example, you may run a customer survey with open-ended questions to discover users’ concerns—in their own words—about their experience with your product. Instead of having to process hundreds of answers manually, a content analysis tool helps you analyze and group results based on the emotion expressed in texts.

Some other examples of content analysis include:

Analyzing brand mentions on social media to understand your brand's reputation

Reviewing customer feedback to evaluate (and then improve) the customer and user experience (UX)

Researching competitors’ website pages to identify their competitive advantages and value propositions

Interpreting customer interviews and survey results to determine user preferences, and setting the direction for new product or feature developments

Content analysis was a major part of our growth during my time at Hypercontext.

[It gave us] a better understanding of the [blog] topics that performed best for signing new users up. We were also able to go deeper within those blog posts to better understand the formats [that worked].

2. Thematic analysis

Thematic analysis helps you identify, categorize, analyze, and interpret patterns in qualitative study data , and can be done with tools like Dovetail and Thematic .

While content analysis and thematic analysis seem similar, they're different in concept: 

Content analysis can be applied to both qualitative and quantitative data , and focuses on identifying frequencies and recurring words and subjects

Thematic analysis can only be applied to qualitative data, and focuses on identifying patterns and themes

The benefits and drawbacks of thematic analysis

How thematic analysis can help your team

Thematic analysis can be used by pretty much anyone: from product marketers, to customer relationship managers, to UX researchers.

For example, product teams use thematic analysis to better understand user behaviors and needs and improve UX . Analyzing customer feedback lets you identify themes (e.g. poor navigation or a buggy mobile interface) highlighted by users and get actionable insight into what they really expect from the product. 

💡 Pro tip: looking for a way to expedite the data analysis process for large amounts of data you collected with a survey? Try Hotjar’s AI for Surveys : along with generating a survey based on your goal in seconds, our AI will analyze the raw data and prepare an automated summary report that presents key thematic findings, respondent quotes, and actionable steps to take, making the analysis of qualitative data a breeze.

3. Narrative analysis

Narrative analysis is a method used to interpret research participants’ stories —things like testimonials , case studies, focus groups, interviews, and other text or visual data—with tools like Delve and AI-powered ATLAS.ti .

Some formats don’t work well with narrative analysis, including heavily structured interviews and written surveys, which don’t give participants as much opportunity to tell their stories in their own words.

Benefits and challenges of narrative analysis

How narrative analysis can help your team

Narrative analysis provides product teams with valuable insight into the complexity of customers’ lives, feelings, and behaviors.

In a marketing research context, narrative analysis involves capturing and reviewing customer stories—on social media, for example—to get in-depth insight into their lives, priorities, and challenges. 

This might look like analyzing daily content shared by your audiences’ favorite influencers on Instagram, or analyzing customer reviews on sites like G2 or Capterra to gain a deep understanding of individual customer experiences. The results of this analysis also contribute to developing corresponding customer personas .

💡 Pro tip: conducting user interviews is an excellent way to collect data for narrative analysis. Though interviews can be time-intensive, there are tools out there that streamline the workload. 

Hotjar Engage automates the entire process, from recruiting to scheduling to generating the all-important interview transcripts you’ll need for the analysis phase of your research project.

4. Grounded theory analysis

Grounded theory analysis is a method of conducting qualitative research to develop theories by examining real-world data. This technique involves the creation of hypotheses and theories through qualitative data collection and evaluation, and can be performed with qualitative data analysis software tools like MAXQDA and NVivo .

Unlike other qualitative data analysis techniques, this method is inductive rather than deductive: it develops theories from data, not the other way around.

The benefits and challenges of grounded theory analysis

How grounded theory analysis can help your team

Grounded theory analysis is used by software engineers, product marketers, managers, and other specialists who deal with data sets to make informed business decisions. 

For example, product marketing teams may turn to customer surveys to understand the reasons behind high churn rates , then use grounded theory to analyze responses and develop hypotheses about why users churn, and how you can get them to stay. 

Grounded theory can also be helpful in the talent management process. For example, HR representatives may use it to develop theories about low employee engagement, and come up with solutions based on their research findings.

5. Discourse analysis

Discourse analysis is the act of researching the underlying meaning of qualitative data. It involves the observation of texts, audio, and videos to study the relationships between information and its social context.

In contrast to content analysis, this method focuses on the contextual meaning of language: discourse analysis sheds light on what audiences think of a topic, and why they feel the way they do about it.

Benefits and challenges of discourse analysis

How discourse analysis can help your team

In a business context, this method is primarily used by marketing teams. Discourse analysis helps marketers understand the norms and ideas in their market , and reveals why they play such a significant role for their customers. 

Once the origins of trends are uncovered, it’s easier to develop a company mission, create a unique tone of voice, and craft effective marketing messages.

Which qualitative data analysis method should you choose?

While the five qualitative data analysis methods we list above are all aimed at processing data and answering research questions, these techniques differ in their intent and the approaches applied.  

Choosing the right analysis method for your team isn't a matter of preference—selecting a method that fits is only possible once you define your research goals and have a clear intention. When you know what you need (and why you need it), you can identify an analysis method that aligns with your research objectives.

Gather qualitative data with Hotjar

Use Hotjar’s product experience insights in your qualitative research. Collect feedback, uncover behavior trends, and understand the ‘why’ behind user actions.

FAQs about qualitative data analysis methods

What is the qualitative data analysis approach?

The qualitative data analysis approach refers to the process of systematizing descriptive data collected through interviews, focus groups, surveys, and observations and then interpreting it. The methodology aims to identify patterns and themes behind textual data, and other unquantifiable data, as opposed to numerical data.

What are qualitative data analysis methods?

Five popular qualitative data analysis methods are content analysis, thematic analysis, narrative analysis, grounded theory analysis, and discourse analysis.

What is the process of qualitative data analysis?

The process of qualitative data analysis includes six steps:

Define your research question

Prepare the data

Choose the method of qualitative analysis

Code the data

Identify themes, patterns, and relationships

Make hypotheses and act



Qualitative Data – Types, Methods and Examples


Qualitative Data

Definition:

Qualitative data is a type of data that is collected and analyzed in a non-numerical form, such as words, images, or observations. It is generally used to gain an in-depth understanding of complex phenomena, such as human behavior, attitudes, and beliefs.

Types of Qualitative Data

There are various types of qualitative data that can be collected and analyzed, including:

  • Interviews : These involve in-depth, face-to-face conversations with individuals or groups to gather their perspectives, experiences, and opinions on a particular topic.
  • Focus Groups: These are group discussions where a facilitator leads a discussion on a specific topic, allowing participants to share their views and experiences.
  • Observations : These involve observing and recording the behavior and interactions of individuals or groups in a particular setting.
  • Case Studies: These involve in-depth analysis of a particular individual, group, or organization, usually over an extended period.
  • Document Analysis : This involves examining written or recorded materials, such as newspaper articles, diaries, or public records, to gain insight into a particular topic.
  • Visual Data : This involves analyzing images or videos to understand people’s experiences or perspectives on a particular topic.
  • Online Data: This involves analyzing data collected from social media platforms, forums, or online communities to understand people’s views and opinions on a particular topic.

Qualitative Data Formats

Qualitative data can be collected and presented in various formats. Some common formats include:

  • Textual data: This includes written or transcribed data from interviews, focus groups, or observations. It can be analyzed using various techniques such as thematic analysis or content analysis.
  • Audio data: This includes recordings of interviews or focus groups, which can be transcribed and analyzed using software such as NVivo.
  • Visual data: This includes photographs, videos, or drawings, which can be analyzed using techniques such as visual analysis or semiotics.
  • Mixed media data : This includes data collected in different formats, such as audio and text. This can be analyzed using mixed methods research, which combines both qualitative and quantitative research methods.
  • Field notes: These are notes taken by researchers during observations, which can include descriptions of the setting, behaviors, and interactions of participants.

Qualitative Data Analysis Methods

Qualitative data analysis refers to the process of systematically analyzing and interpreting qualitative data to identify patterns, themes, and relationships. Here are some common methods of analyzing qualitative data:

  • Thematic analysis: This involves identifying and analyzing patterns or themes within the data. It involves coding the data into themes and subthemes and organizing them into a coherent narrative.
  • Content analysis: This involves analyzing the content of the data, such as the words, phrases, or images used. It involves identifying patterns and themes in the data and examining the relationships between them.
  • Discourse analysis: This involves analyzing the language and communication used in the data, such as the meaning behind certain words or phrases. It involves examining how the language constructs and shapes social reality.
  • Grounded theory: This involves developing a theory or framework based on the data. It involves identifying patterns and themes in the data and using them to develop a theory that explains the phenomenon being studied.
  • Narrative analysis : This involves analyzing the stories and narratives present in the data. It involves examining how the stories are constructed and how they contribute to the overall understanding of the phenomenon being studied.
  • Ethnographic analysis : This involves analyzing the culture and social practices present in the data. It involves examining how the cultural and social practices contribute to the phenomenon being studied.

Qualitative Data Collection Guide

Here are some steps to guide the collection of qualitative data:

  • Define the research question : Start by clearly defining the research question that you want to answer. This will guide the selection of data collection methods and help to ensure that the data collected is relevant to the research question.
  • Choose data collection methods : Select the most appropriate data collection methods based on the research question, the research design, and the resources available. Common methods include interviews, focus groups, observations, document analysis, and participatory research.
  • Develop a data collection plan : Develop a plan for data collection that outlines the specific procedures, timelines, and resources needed for each data collection method. This plan should include details such as how to recruit participants, how to conduct interviews or focus groups, and how to record and store data.
  • Obtain ethical approval : Obtain ethical approval from an institutional review board or ethics committee before beginning data collection. This is particularly important when working with human participants to ensure that their rights and interests are protected.
  • Recruit participants: Recruit participants based on the research question and the data collection methods chosen. This may involve purposive sampling, snowball sampling, or random sampling.
  • Collect data: Collect data using the chosen data collection methods. This may involve conducting interviews, facilitating focus groups, observing participants, or analyzing documents.
  • Transcribe and store data : Transcribe and store the data in a secure location. This may involve transcribing audio or video recordings, organizing field notes, or scanning documents.
  • Analyze data: Analyze the data using appropriate qualitative data analysis methods, such as thematic analysis or content analysis.
  • Interpret findings: Interpret the findings of the data analysis in the context of the research question and the relevant literature. This may involve developing new theories or frameworks, or validating existing ones.
  • Communicate results: Communicate the results of the research in a clear and concise manner, using appropriate language and visual aids where necessary. This may involve writing a report, presenting at a conference, or publishing in a peer-reviewed journal.

Qualitative Data Examples

Some examples of qualitative data in different fields are as follows:

  • Sociology : In sociology, qualitative data is used to study social phenomena such as culture, norms, and social relationships. For example, a researcher might conduct interviews with members of a community to understand their beliefs and practices.
  • Psychology : In psychology, qualitative data is used to study human behavior, emotions, and attitudes. For example, a researcher might conduct a focus group to explore how individuals with anxiety cope with their symptoms.
  • Education : In education, qualitative data is used to study learning processes and educational outcomes. For example, a researcher might conduct observations in a classroom to understand how students interact with each other and with their teacher.
  • Marketing : In marketing, qualitative data is used to understand consumer behavior and preferences. For example, a researcher might conduct in-depth interviews with customers to understand their purchasing decisions.
  • Anthropology : In anthropology, qualitative data is used to study human cultures and societies. For example, a researcher might conduct participant observation in a remote community to understand their customs and traditions.
  • Health Sciences: In health sciences, qualitative data is used to study patient experiences, beliefs, and preferences. For example, a researcher might conduct interviews with cancer patients to understand how they cope with their illness.

Application of Qualitative Data

Qualitative data is used in a variety of fields and has numerous applications. Here are some common applications of qualitative data:

  • Exploratory research: Qualitative data is often used in exploratory research to understand a new or unfamiliar topic. Researchers use qualitative data to generate hypotheses and develop a deeper understanding of the research question.
  • Evaluation: Qualitative data is often used to evaluate programs or interventions. Researchers use qualitative data to understand the impact of a program or intervention on the people who participate in it.
  • Needs assessment: Qualitative data is often used in needs assessments to understand the needs of a specific population. Researchers use qualitative data to identify the most pressing needs of the population and develop strategies to address those needs.
  • Case studies: Qualitative data is often used in case studies to understand a particular case in detail. Researchers use qualitative data to understand the context, experiences, and perspectives of the people involved in the case.
  • Market research: Qualitative data is often used in market research to understand consumer behavior and preferences. Researchers use qualitative data to gain insights into consumer attitudes, opinions, and motivations.
  • Social and cultural research : Qualitative data is often used in social and cultural research to understand social phenomena such as culture, norms, and social relationships. Researchers use qualitative data to understand the experiences, beliefs, and practices of individuals and communities.

Purpose of Qualitative Data

The purpose of qualitative data is to gain a deeper understanding of social phenomena that cannot be captured by numerical or quantitative data. Qualitative data is collected through methods such as observation, interviews, and focus groups, and it provides descriptive information that can shed light on people’s experiences, beliefs, attitudes, and behaviors.

Qualitative data serves several purposes, including:

  • Generating hypotheses: Qualitative data can be used to generate hypotheses about social phenomena that can be further tested with quantitative data.
  • Providing context : Qualitative data provides a rich and detailed context for understanding social phenomena that cannot be captured by numerical data alone.
  • Exploring complex phenomena : Qualitative data can be used to explore complex phenomena such as culture, social relationships, and the experiences of marginalized groups.
  • Evaluating programs and interventions: Qualitative data can be used to evaluate the impact of programs and interventions on the people who participate in them.
  • Enhancing understanding: Qualitative data can be used to enhance understanding of the experiences, beliefs, and attitudes of individuals and communities, which can inform policy and practice.

When to use Qualitative Data

Qualitative data is appropriate when the research question requires an in-depth understanding of complex social phenomena that cannot be captured by numerical or quantitative data.

Here are some situations when qualitative data is appropriate:

  • Exploratory research : Qualitative data is often used in exploratory research to generate hypotheses and develop a deeper understanding of a research question.
  • Understanding social phenomena : Qualitative data is appropriate when the research question requires an in-depth understanding of social phenomena such as culture, social relationships, and experiences of marginalized groups.
  • Program evaluation: Qualitative data is often used in program evaluation to understand the impact of a program on the people who participate in it.
  • Needs assessment: Qualitative data is often used in needs assessments to understand the needs of a specific population.
  • Market research: Qualitative data is often used in market research to understand consumer behavior and preferences.
  • Case studies: Qualitative data is often used in case studies to understand a particular case in detail.

Characteristics of Qualitative Data

Here are some characteristics of qualitative data:

  • Descriptive : Qualitative data provides a rich and detailed description of the social phenomena under investigation.
  • Contextual : Qualitative data is collected in the context in which the social phenomena occur, which allows for a deeper understanding of the phenomena.
  • Subjective : Qualitative data reflects the subjective experiences, beliefs, attitudes, and behaviors of the individuals and communities under investigation.
  • Flexible : Qualitative data collection methods are flexible and can be adapted to the specific needs of the research question.
  • Emergent : Qualitative data analysis is often an iterative process, where new themes and patterns emerge as the data is analyzed.
  • Interpretive : Qualitative data analysis involves interpretation of the data, which requires the researcher to be reflexive and aware of their own biases and assumptions.
  • Non-standardized: Qualitative data collection methods are often non-standardized, which means that the data is not collected in a standardized or uniform way.

Advantages of Qualitative Data

Some advantages of qualitative data are as follows:

  • Richness : Qualitative data provides a rich and detailed description of the social phenomena under investigation, allowing for a deeper understanding of the phenomena.
  • Flexibility : Qualitative data collection methods are flexible and can be adapted to the specific needs of the research question, allowing for a more nuanced exploration of social phenomena.
  • Contextualization : Qualitative data is collected in the context in which the social phenomena occur, which allows for a deeper understanding of the phenomena and their cultural and social context.
  • Subjectivity : Qualitative data reflects the subjective experiences, beliefs, attitudes, and behaviors of the individuals and communities under investigation, allowing for a more holistic understanding of the phenomena.
  • New insights : Qualitative data can generate new insights and hypotheses that can be further tested with quantitative data.
  • Participant voice : Qualitative data collection methods often involve direct participation by the individuals and communities under investigation, allowing for their voices to be heard.
  • Ethical considerations: Qualitative data collection methods often prioritize ethical considerations such as informed consent, confidentiality, and respect for the autonomy of the participants.

Limitations of Qualitative Data

Here are some limitations of qualitative data:

  • Subjectivity : Qualitative data is subjective, and the interpretation of the data depends on the researcher’s own biases, assumptions, and perspectives.
  • Small sample size: Qualitative data collection methods often involve a small sample size, which limits the generalizability of the findings.
  • Time-consuming: Qualitative data collection and analysis can be time-consuming, as it requires in-depth engagement with the data and often involves iterative processes.
  • Limited statistical analysis: Qualitative data is often not suitable for statistical analysis, which limits the ability to draw quantitative conclusions from the data.
  • Limited comparability: Qualitative data collection methods are often non-standardized, which makes it difficult to compare findings across different studies or contexts.
  • Social desirability bias : Qualitative data collection methods often rely on self-reporting by the participants, which can be influenced by social desirability bias.
  • Researcher bias: The researcher’s own biases, assumptions, and perspectives can influence the data collection and analysis, which can limit the objectivity of the findings.



SMU Simmons School of Education & Human Development

Qualitative vs. quantitative data analysis: How do they differ?


Learning analytics have become the cornerstone for personalizing student experiences and enhancing learning outcomes. In this data-informed approach to education there are two distinct methodologies: qualitative and quantitative analytics. These methods, which are typical to data analytics in general, are crucial to the interpretation of learning behaviors and outcomes. This blog will explore the nuances that distinguish qualitative and quantitative research, while uncovering their shared roles in learning analytics, program design and instruction.

What is qualitative data?

Qualitative data is descriptive and includes information that is non-numerical. Qualitative research is used to gather in-depth insights that can't easily be measured on a scale, like opinions, anecdotes and emotions. In learning analytics, qualitative data could include in-depth interviews, text responses to a prompt, or a video of a class period. 1

What is quantitative data?

Quantitative data is information that has a numerical value. Quantitative research is conducted to gather measurable data used in statistical analysis. Researchers can use quantitative studies to identify patterns and trends. In learning analytics quantitative data could include test scores, student demographics, or amount of time spent in a lesson. 2

Key difference between qualitative and quantitative data

It's important to understand the differences between qualitative and quantitative data to both determine the appropriate research methods for studies and to gain insights that you can be confident in sharing.

Data Types and Nature

Examples of qualitative data types in learning analytics:

  • Observational data of human behavior from classroom settings such as student engagement, teacher-student interactions, and classroom dynamics
  • Textual data from open-ended survey responses, reflective journals, and written assignments
  • Feedback and discussions from focus groups or interviews
  • Content analysis from various media

Examples of quantitative data types:

  • Standardized test, assessment, and quiz scores
  • Grades and grade point averages
  • Attendance records
  • Time spent on learning tasks
  • Data gathered from learning management systems (LMS), including login frequency, online participation, and completion rates of assignments

Methods of Collection

Qualitative and quantitative research methods for data collection can occasionally seem similar, so it's important to note the differences to make sure you're creating a consistent data set and will be able to reliably draw conclusions from your data.

Qualitative research methods

Because of the nature of qualitative data (complex, detailed information), the research methods used to collect it are more involved. Qualitative researchers might do the following to collect data:

  • Conduct interviews to learn about subjective experiences
  • Host focus groups to gather feedback and personal accounts
  • Observe in-person or use audio or video recordings to record nuances of human behavior in a natural setting
  • Distribute surveys with open-ended questions

Quantitative research methods

Quantitative data collection methods are more diverse and more likely to be automated because of the objective nature of the data. A quantitative researcher could employ methods such as:

  • Surveys with close-ended questions that gather numerical data like birthdates or preferences
  • Observational research and record measurable information like the number of students in a classroom
  • Automated numerical data collection, such as information collected on the backend of a computer system (e.g. button clicks and page views)

Analysis techniques

Qualitative and quantitative data can both be very informative. However, research studies require critical thinking for productive analysis.

Qualitative data analysis methods

Analyzing qualitative data takes a number of steps. When you first get all your data in one place you can do a review and take notes of trends you think you're seeing, along with your initial reactions. Next, you'll want to organize all the qualitative data you've collected by assigning it to categories. Your central research question will guide your data categorization, whether it's by date, location, type of collection method (interview vs. focus group, etc.), the specific question asked, or something else. Next, you'll code your data. Whereas categorizing data is focused on the method of collection, coding is the process of identifying and labeling themes within the data collected to get closer to answering your research questions. Finally comes data interpretation. To interpret the data, review the information gathered, including your coding labels, and see which results occur frequently or what other conclusions you can draw. 3
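
To illustrate how categorizing and coding fit together, here is a minimal Python sketch that cross-tabulates invented codes against the collection method they came from. The categories, codes, and excerpts are hypothetical examples, not real study data.

```python
# Minimal sketch of the categorize -> code -> interpret flow described above.
# The categories (collection method), codes, and excerpt texts are invented.
import pandas as pd

data = pd.DataFrame({
    "category": ["interview", "interview", "focus group", "focus group", "open survey"],
    "code":     ["motivation", "barriers", "barriers", "motivation", "barriers"],
    "excerpt":  [
        "I study more when the goal is clear.",
        "Evening classes clash with my job.",
        "Group members mentioned childcare conflicts.",
        "Peer pressure keeps me on track.",
        "Too many assignments due the same week.",
    ],
})

# Interpretation aid: how often does each code appear within each category?
print(pd.crosstab(data["code"], data["category"]))
```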

Quantitative analysis techniques

The process to analyze quantitative data can be time-consuming due to the large volume of data possible to collect. When approaching a quantitative data set, start by focusing on the purpose of your evaluation. Without making a conclusion, determine how you will use the information gained from the analysis; for example: the answers to this survey about study habits will help determine what type of exam review session will be most useful to a class. 4

Next, you need to decide who is analyzing the data and set parameters for analysis. For example, if two different researchers are evaluating survey responses that rank preferences on a scale from 1 to 5, they need to be operating with the same understanding of the rankings. You wouldn't want one researcher to classify the value of 3 to be a positive preference while the other considers it a negative preference. It's also ideal to have some type of data management system to store and organize your data, such as a spreadsheet or database. Within the database, or via an export to data analysis software, the collected data needs to be cleaned of things like responses left blank, duplicate answers from respondents, and questions that are no longer considered relevant. Finally, you can use statistical software to analyze data (or complete a manual analysis) to find patterns and summarize your findings. 4
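
As an illustration of the cleaning and summarizing steps just described, the sketch below uses Python's pandas library on a tiny, invented survey data set; the column names and values are hypothetical.

```python
# Minimal sketch of the quantitative workflow described above: clean a small
# survey data set (drop blanks and duplicate respondents) and summarize it.
import pandas as pd

responses = pd.DataFrame({
    "respondent_id": [1, 2, 2, 3, 4, 5],
    "study_hours":   [5, 3, 3, None, 8, 2],
    "review_pref":   [4, 5, 5, 3, None, 2],  # 1-5 scale agreed on by all analysts
})

clean = (
    responses
    .drop_duplicates(subset="respondent_id")  # remove duplicate answers from respondents
    .dropna()                                 # remove responses left blank
)

print(clean.describe())                       # basic descriptive statistics
print(clean["review_pref"].value_counts())    # distribution of preferences
```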

Qualitative and quantitative research tools

From the nuanced, thematic exploration enabled by tools like NVivo and ATLAS.ti, to the statistical precision of SPSS and R for quantitative analysis, each suite of data analysis tools offers tailored functionalities that cater to the distinct natures of different data types.

Qualitative research software:

NVivo: NVivo is qualitative data analysis software that can do everything from transcribing recordings to creating word clouds and evaluating uploads for different sentiments and themes. NVivo is just one tool from the company Lumivero, which offers whole suites of data processing software. 5

ATLAS.ti: Similar to NVivo, ATLAS.ti allows researchers to upload and import data from a variety of sources to be tagged and refined using machine learning, then presented with visualizations ready to insert into reports. 6

Quantitative research software:

SPSS: SPSS is a statistical analysis tool for quantitative research, appreciated for its user-friendly interface and comprehensive statistical tests, which make it ideal for educators and researchers. With SPSS, researchers can manage and analyze large quantitative data sets, use advanced statistical procedures and modeling techniques, predict customer behaviors, forecast market trends and more. 7

R: R is a versatile and dynamic open-source tool for quantitative analysis. With a vast repository of packages tailored to specific statistical methods, researchers can perform anything from basic descriptive statistics to complex predictive modeling. R is especially useful for its ability to handle large datasets, making it ideal for educational institutions that generate substantial amounts of data. The programming language offers flexibility in customizing analysis and creating publication-quality visualizations to effectively communicate results. 8

Applications in Educational Research

Both quantitative and qualitative data can be employed in learning analytics to drive informed decision-making and pedagogical enhancements. In the classroom, quantitative data like standardized test scores and online course analytics create a foundation for assessing and benchmarking student performance and engagement. Qualitative insights gathered from surveys, focus group discussions, and reflective student journals offer a more nuanced understanding of learners' experiences and contextual factors influencing their education. Additionally feedback and practical engagement metrics blend these data types, providing a holistic view that informs curriculum development, instructional strategies, and personalized learning pathways. Through these varied data sets and uses, educators can piece together a more complete narrative of student success and the impacts of educational interventions.

Master Data Analysis with an M.S. in Learning Sciences From SMU

Whether it is the detailed narratives unearthed through qualitative data or the informative patterns derived from quantitative analysis, both qualitative and quantitative data can provide crucial information for educators and researchers to better understand and improve learning. Dive deeper into the art and science of learning analytics with SMU's online Master of Science in the Learning Sciences program . At SMU, innovation and inquiry converge to empower the next generation of educators and researchers. Choose the Learning Analytics Specialization to learn how to harness the power of data science to illuminate learning trends, devise impactful strategies, and drive educational innovation. You could also find out how advanced technologies like augmented reality (AR), virtual reality (VR), and artificial intelligence (AI) can revolutionize education, and develop the insight to apply embodied cognition principles to enhance learning experiences in the Learning and Technology Design Specialization , or choose your own electives to build a specialization unique to your interests and career goals.

For more information on our curriculum and to become part of a community where data drives discovery, visit SMU's MSLS program website or schedule a call with our admissions outreach advisors for any queries or further discussion. Take the first step towards transforming education with data today.

  • Retrieved on August 8, 2024, from nnlm.gov/guides/data-glossary/qualitative-data
  • Retrieved on August 8, 2024, from nnlm.gov/guides/data-glossary/quantitative-data
  • Retrieved on August 8, 2024, from cdc.gov/healthyyouth/evaluation/pdf/brief19.pdf
  • Retrieved on August 8, 2024, from cdc.gov/healthyyouth/evaluation/pdf/brief20.pdf
  • Retrieved on August 8, 2024, from lumivero.com/solutions/
  • Retrieved on August 8, 2024, from atlasti.com/
  • Retrieved on August 8, 2024, from ibm.com/products/spss-statistics
  • Retrieved on August 8, 2024, from cran.r-project.org/doc/manuals/r-release/R-intro.html#Introduction-and-preliminaries



Qualitative Research: Data Collection, Analysis, and Management

INTRODUCTION

In an earlier paper, 1 we presented an introduction to using qualitative research methods in pharmacy practice. In this article, we review some principles of the collection, analysis, and management of qualitative data to help pharmacists interested in doing research in their practice to continue their learning in this area. Qualitative research can help researchers to access the thoughts and feelings of research participants, which can enable development of an understanding of the meaning that people ascribe to their experiences. Whereas quantitative research methods can be used to determine how many people undertake particular behaviours, qualitative methods can help researchers to understand how and why such behaviours take place. Within the context of pharmacy practice research, qualitative approaches have been used to examine a diverse array of topics, including the perceptions of key stakeholders regarding prescribing by pharmacists and the postgraduation employment experiences of young pharmacists (see “Further Reading” section at the end of this article).

In the previous paper, 1 we outlined 3 commonly used methodologies: ethnography 2 , grounded theory 3 , and phenomenology. 4 Briefly, ethnography involves researchers using direct observation to study participants in their “real life” environment, sometimes over extended periods. Grounded theory and its later modified versions (e.g., Strauss and Corbin 5 ) use face-to-face interviews and interactions such as focus groups to explore a particular research phenomenon and may help in clarifying a less-well-understood problem, situation, or context. Phenomenology shares some features with grounded theory (such as an exploration of participants’ behaviour) and uses similar techniques to collect data, but it focuses on understanding how human beings experience their world. It gives researchers the opportunity to put themselves in another person’s shoes and to understand the subjective experiences of participants. 6 Some researchers use qualitative methodologies but adopt a different standpoint, and an example of this appears in the work of Thurston and others, 7 discussed later in this paper.

Qualitative work requires reflection on the part of researchers, both before and during the research process, as a way of providing context and understanding for readers. When being reflexive, researchers should not try to simply ignore or avoid their own biases (as this would likely be impossible); instead, reflexivity requires researchers to reflect upon and clearly articulate their position and subjectivities (world view, perspectives, biases), so that readers can better understand the filters through which questions were asked, data were gathered and analyzed, and findings were reported. From this perspective, bias and subjectivity are not inherently negative but they are unavoidable; as a result, it is best that they be articulated up-front in a manner that is clear and coherent for readers.

THE PARTICIPANT’S VIEWPOINT

What qualitative study seeks to convey is why people have thoughts and feelings that might affect the way they behave. Such study may occur in any number of contexts, but here, we focus on pharmacy practice and the way people behave with regard to medicines use (e.g., to understand patients’ reasons for nonadherence with medication therapy or to explore physicians’ resistance to pharmacists’ clinical suggestions). As we suggested in our earlier article, 1 an important point about qualitative research is that there is no attempt to generalize the findings to a wider population. Qualitative research is used to gain insights into people’s feelings and thoughts, which may provide the basis for a future stand-alone qualitative study or may help researchers to map out survey instruments for use in a quantitative study. It is also possible to use different types of research in the same study, an approach known as “mixed methods” research, and further reading on this topic may be found at the end of this paper.

The role of the researcher in qualitative research is to attempt to access the thoughts and feelings of study participants. This is not an easy task, as it involves asking people to talk about things that may be very personal to them. Sometimes the experiences being explored are fresh in the participant’s mind, whereas on other occasions reliving past experiences may be difficult. However the data are being collected, a primary responsibility of the researcher is to safeguard participants and their data. Mechanisms for such safeguarding must be clearly articulated to participants and must be approved by a relevant research ethics review board before the research begins. Researchers and practitioners new to qualitative research should seek advice from an experienced qualitative researcher before embarking on their project.

DATA COLLECTION

Whatever philosophical standpoint the researcher is taking and whatever the data collection method (e.g., focus group, one-to-one interviews), the process will involve the generation of large amounts of data. In addition to the variety of study methodologies available, there are also different ways of making a record of what is said and done during an interview or focus group, such as taking handwritten notes or video-recording. If the researcher is audio- or video-recording data collection, then the recordings must be transcribed verbatim before data analysis can begin. As a rough guide, it can take an experienced researcher/transcriber 8 hours to transcribe one 45-minute audio-recorded interview, a process that will generate 20–30 pages of written dialogue.
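As a rough planning aid, these guide figures can be turned into a quick workload estimate before fieldwork begins. The sketch below (Python) is illustrative only; the interview durations are hypothetical, and actual times vary with audio quality and transcriber experience.

```python
# Rough planning aid for transcription workload, using the guide figures above
# (about 8 hours of transcriber time and 20-30 pages of dialogue per 45 minutes
# of audio). The interview durations below are hypothetical examples.

interview_minutes = [45, 60, 30, 50]  # planned or recorded interview lengths

HOURS_PER_45_MIN = 8          # transcription time per 45 min of audio (rough guide)
PAGES_PER_45_MIN = (20, 30)   # pages of dialogue per 45 min of audio (rough guide)

total_audio = sum(interview_minutes)
est_hours = total_audio / 45 * HOURS_PER_45_MIN
est_pages_low = total_audio / 45 * PAGES_PER_45_MIN[0]
est_pages_high = total_audio / 45 * PAGES_PER_45_MIN[1]

print(f"Audio to transcribe: {total_audio} minutes")
print(f"Estimated transcription time: {est_hours:.0f} hours")
print(f"Estimated transcript length: {est_pages_low:.0f}-{est_pages_high:.0f} pages")
```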

Many researchers will also maintain a folder of “field notes” to complement audio-taped interviews. Field notes allow the researcher to maintain and comment upon impressions, environmental contexts, behaviours, and nonverbal cues that may not be adequately captured through the audio-recording; they are typically handwritten in a small notebook at the same time the interview takes place. Field notes can provide important context to the interpretation of audio-taped data and can help remind the researcher of situational factors that may be important during data analysis. Such notes need not be formal, but they should be maintained and secured in a similar manner to audio tapes and transcripts, as they contain sensitive information and are relevant to the research. For more information about collecting qualitative data, please see the “Further Reading” section at the end of this paper.

DATA ANALYSIS AND MANAGEMENT

If, as suggested earlier, doing qualitative research is about putting oneself in another person’s shoes and seeing the world from that person’s perspective, the most important part of data analysis and management is to be true to the participants. It is their voices that the researcher is trying to hear, so that they can be interpreted and reported on for others to read and learn from. To illustrate this point, consider the anonymized transcript excerpt presented in Appendix 1 , which is taken from a research interview conducted by one of the authors (J.S.). We refer to this excerpt throughout the remainder of this paper to illustrate how data can be managed, analyzed, and presented.

Interpretation of Data

Interpretation of the data will depend on the theoretical standpoint taken by researchers. For example, the title of the research report by Thurston and others, 7 “Discordant indigenous and provider frames explain challenges in improving access to arthritis care: a qualitative study using constructivist grounded theory,” indicates at least 2 theoretical standpoints. The first is the culture of the indigenous population of Canada and the place of this population in society, and the second is the social constructivist theory used in the constructivist grounded theory method. With regard to the first standpoint, it can be surmised that, to have decided to conduct the research, the researchers must have felt that there was anecdotal evidence of differences in access to arthritis care for patients from indigenous and non-indigenous backgrounds. With regard to the second standpoint, it can be surmised that the researchers used social constructivist theory because it assumes that behaviour is socially constructed; in other words, people do things because of the expectations of those in their personal world or in the wider society in which they live. (Please see the “Further Reading” section for resources providing more information about social constructivist theory and reflexivity.) Thus, these 2 standpoints (and there may have been others relevant to the research of Thurston and others 7 ) will have affected the way in which these researchers interpreted the experiences of the indigenous population participants and those providing their care. Another standpoint is feminist standpoint theory which, among other things, focuses on marginalized groups in society. Such theories are helpful to researchers, as they enable us to think about things from a different perspective. Being aware of the standpoints you are taking in your own research is one of the foundations of qualitative work. Without such awareness, it is easy to slip into interpreting other people’s narratives from your own viewpoint, rather than that of the participants.

To analyze the example in Appendix 1, we will adopt a phenomenological approach because we want to understand how the participant experienced the illness and we want to try to see the experience from that person’s perspective. It is important for the researcher to reflect upon and articulate his or her starting point for such analysis; in this case, the coder could reflect upon her own experience as a female of a majority ethnocultural group who has lived within middle class and upper middle class settings. This personal history therefore forms the filter through which the data will be examined. This filter does not diminish the quality or significance of the analysis, since every researcher has his or her own filters; however, by explicitly stating and acknowledging what these filters are, the researcher makes it easier for readers to contextualize the work.

Transcribing and Checking

For the purposes of this paper it is assumed that interviews or focus groups have been audio-recorded. As mentioned above, transcribing is an arduous process, even for the most experienced transcribers, but it must be done to convert the spoken word to the written word to facilitate analysis. For anyone new to conducting qualitative research, it is beneficial to transcribe at least one interview and one focus group. It is only by doing this that researchers realize how difficult the task is, and this realization affects their expectations when asking others to transcribe. If the research project has sufficient funding, then a professional transcriber can be hired to do the work. If this is the case, then it is a good idea to sit down with the transcriber, if possible, and talk through the research and what the participants were talking about. This background knowledge for the transcriber is especially important in research in which people are using jargon or medical terms (as in pharmacy practice). Involving your transcriber in this way makes the work both easier and more rewarding, as he or she will feel part of the team. Transcription editing software is also available, but it is expensive. For example, ELAN (more formally known as EUDICO Linguistic Annotator, developed at the Technical University of Berlin) 8 is a tool that can help keep data organized by linking media and data files (particularly valuable if, for example, video-taping of interviews is complemented by transcriptions). It can also be helpful in searching complex data sets. Products such as ELAN do not automatically transcribe interviews or complete analyses, and they do require some time and effort to learn; nonetheless, for some research applications, it may be valuable to consider such software tools.

All audio recordings should be transcribed verbatim, regardless of how intelligible the transcript may be when it is read back. Lines of text should be numbered. Once the transcription is complete, the researcher should read it while listening to the recording and do the following: correct any spelling or other errors; anonymize the transcript so that the participant cannot be identified from anything that is said (e.g., names, places, significant events); insert notations for pauses, laughter, looks of discomfort; insert any punctuation, such as commas and full stops (periods) (see Appendix 1 for examples of inserted punctuation), and include any other contextual information that might have affected the participant (e.g., temperature or comfort of the room).
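Parts of this checking step can be supported by a simple script, although every automated change still needs manual review against the recording. The minimal sketch below (Python) assumes a plain-text transcript file; the file name and the list of identifiers to replace are hypothetical.

```python
import re

# Minimal sketch of two of the checking steps described above: numbering the
# lines of a transcript and replacing candidate identifiers for anonymization.
# The file path and the identifier list are hypothetical; automated replacement
# only supports, and never substitutes for, a careful manual check.

identifiers = {"Dr Smith": "Dr XXX", "Springfield": "[place]"}  # hypothetical examples

with open("interview_01.txt", encoding="utf-8") as f:
    lines = f.read().splitlines()

numbered = []
for number, line in enumerate(lines, start=1):
    for name, replacement in identifiers.items():
        line = re.sub(re.escape(name), replacement, line)
    numbered.append(f"{number:>3}  {line}")

with open("interview_01_checked.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(numbered))
```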

Dealing with the transcription of a focus group is slightly more difficult, as multiple voices are involved. One way of transcribing such data is to “tag” each voice (e.g., Voice A, Voice B). In addition, the focus group will usually have 2 facilitators, whose respective roles will help in making sense of the data. While one facilitator guides participants through the topic, the other can make notes about context and group dynamics. More information about group dynamics and focus groups can be found in resources listed in the “Further Reading” section.

Reading between the Lines

During the process outlined above, the researcher can begin to get a feel for the participant’s experience of the phenomenon in question and can start to think about things that could be pursued in subsequent interviews or focus groups (if appropriate). In this way, one participant’s narrative informs the next, and the researcher can continue to interview until nothing new is being heard or, as it says in the textbooks, “saturation is reached”. While continuing with the processes of coding and theming (described in the next 2 sections), it is important to consider not just what the person is saying but also what they are not saying. For example, is a lengthy pause an indication that the participant is finding the subject difficult, or is the person simply deciding what to say? The aim of the whole process from data collection to presentation is to tell the participants’ stories using exemplars from their own narratives, thus grounding the research findings in the participants’ lived experiences.

Smith 9 suggested a qualitative research method known as interpretative phenomenological analysis, which has 2 basic tenets: first, that it is rooted in phenomenology, attempting to understand the meaning that individuals ascribe to their lived experiences, and second, that the researcher must attempt to interpret this meaning in the context of the research. That the researcher has some knowledge and expertise in the subject of the research means that he or she can have considerable scope in interpreting the participant’s experiences. Larkin and others 10 discussed the importance of not just providing a description of what participants say. Rather, interpretative phenomenological analysis is about getting underneath what a person is saying to try to truly understand the world from his or her perspective.

Once all of the research interviews have been transcribed and checked, it is time to begin coding. Field notes compiled during an interview can be a useful complementary source of information to facilitate this process, as the gap in time between an interview, transcribing, and coding can result in memory bias regarding nonverbal or environmental context issues that may affect interpretation of data.

Coding refers to the identification of topics, issues, similarities, and differences that are revealed through the participants’ narratives and interpreted by the researcher. This process enables the researcher to begin to understand the world from each participant’s perspective. Coding can be done by hand on a hard copy of the transcript, by making notes in the margin or by highlighting and naming sections of text. More commonly, researchers use qualitative research software (e.g., NVivo, QSR International Pty Ltd; www.qsrinternational.com/products_nvivo.aspx ) to help manage their transcriptions. It is advised that researchers undertake a formal course in the use of such software or seek supervision from a researcher experienced in these tools.

Returning to Appendix 1 and reading from lines 8–11, a code for this section might be “diagnosis of mental health condition”, but this would just be a description of what the participant is talking about at that point. If we read a little more deeply, we can ask ourselves how the participant might have come to feel that the doctor assumed he or she was aware of the diagnosis or indeed that they had only just been told the diagnosis. There are a number of pauses in the narrative that might suggest the participant is finding it difficult to recall that experience. Later in the text, the participant says “nobody asked me any questions about my life” (line 19). This could be coded simply as “health care professionals’ consultation skills”, but that would not reflect how the participant must have felt never to be asked anything about his or her personal life, about the participant as a human being. At the end of this excerpt, the participant just trails off, recalling that no-one showed any interest, which makes for very moving reading. For practitioners in pharmacy, it might also be pertinent to explore the participant’s experience of akathisia and why this was left untreated for 20 years.
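One illustrative way of keeping track of codes like these, when working by hand or in a spreadsheet rather than in software such as NVivo, is sketched below (Python). The entries mirror the codes just discussed but are not a definitive coding of the excerpt.

```python
# Illustrative only: one way to record codes against transcript line ranges when
# coding by hand or in a spreadsheet. Dedicated software such as NVivo manages
# this for you; the codes and notes below echo the discussion of Appendix 1.

codes = [
    {"code": "diagnosis of mental health condition", "transcript": "interview_01", "lines": (8, 11),
     "note": "participant seems unaware of diagnosis; pauses may signal difficulty recalling"},
    {"code": "health care professionals' consultation skills", "transcript": "interview_01", "lines": (19, 19),
     "note": "'nobody asked me any questions about my life' - no interest in the person"},
]

# Group coded segments by code name so each code's supporting excerpts can be reviewed together.
by_code = {}
for entry in codes:
    by_code.setdefault(entry["code"], []).append(entry)

for code, entries in by_code.items():
    print(code)
    for e in entries:
        print(f"  {e['transcript']} lines {e['lines'][0]}-{e['lines'][1]}: {e['note']}")
```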

One of the questions that arises about qualitative research relates to the reliability of the interpretation and representation of the participants’ narratives. There are no statistical tests that can be used to check reliability and validity as there are in quantitative research. However, work by Lincoln and Guba 11 suggests that there are other ways to “establish confidence in the ‘truth’ of the findings” (p. 218). They call this confidence “trustworthiness” and suggest that there are 4 criteria of trustworthiness: credibility (confidence in the “truth” of the findings), transferability (showing that the findings have applicability in other contexts), dependability (showing that the findings are consistent and could be repeated), and confirmability (the extent to which the findings of a study are shaped by the respondents and not researcher bias, motivation, or interest).

One way of establishing the “credibility” of the coding is to ask another researcher to code the same transcript and then to discuss any similarities and differences in the 2 resulting sets of codes. This simple act can result in revisions to the codes and can help to clarify and confirm the research findings.
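A simple listing of where two coders agree and differ can help structure that discussion. The sketch below (Python) compares two hypothetical code lists for the same transcript; consistent with the point above, it computes no statistic and is intended only as an agenda for conversation.

```python
# Minimal sketch: compare two coders' code lists for the same transcript so that
# similarities and differences can be discussed. The code names are hypothetical.

coder_a = {"diagnosis of mental health condition", "medication side effects",
           "not being listened to"}
coder_b = {"diagnosis of mental health condition", "untreated akathisia",
           "lack of interest in personal experiences"}

print("Codes both coders applied:", sorted(coder_a & coder_b))
print("Only coder A:", sorted(coder_a - coder_b))
print("Only coder B:", sorted(coder_b - coder_a))
# The differences become the agenda for discussion and possible revision of codes.
```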

Theming refers to the drawing together of codes from one or more transcripts to present the findings of qualitative research in a coherent and meaningful way. For example, there may be examples across participants’ narratives of the way in which they were treated in hospital, such as “not being listened to” or “lack of interest in personal experiences” (see Appendix 1 ). These may be drawn together as a theme running through the narratives that could be named “the patient’s experience of hospital care”. The importance of going through this process is that at its conclusion, it will be possible to present the data from the interviews using quotations from the individual transcripts to illustrate the source of the researchers’ interpretations. Thus, when the findings are organized for presentation, each theme can become the heading of a section in the report or presentation. Underneath each theme will be the codes, examples from the transcripts, and the researcher’s own interpretation of what the themes mean. Implications for real life (e.g., the treatment of people with chronic mental health problems) should also be given.
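The relationship between themes, codes, and exemplar quotations can be kept explicit in a simple structure so that each theme maps directly onto a report heading. The sketch below (Python) uses the hospital-care example from the text; the exact codes and quotation are illustrative.

```python
# Illustrative sketch of theming: codes from one or more transcripts are drawn
# together under a theme, with exemplar quotations kept alongside so the report
# can present each theme as a heading supported by participants' own words.

themes = {
    "the patient's experience of hospital care": {
        "codes": ["not being listened to", "lack of interest in personal experiences"],
        "quotes": [("interview_01", 19, "nobody asked me any questions about my life")],
    },
}

for theme, detail in themes.items():
    print(theme.upper())
    print("  codes:", ", ".join(detail["codes"]))
    for transcript, line, quote in detail["quotes"]:
        print(f'  "{quote}" ({transcript}, line {line})')
```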

DATA SYNTHESIS

In this final section of this paper, we describe some ways of drawing together or “synthesizing” research findings to represent, as faithfully as possible, the meaning that participants ascribe to their life experiences. This synthesis is the aim of the final stage of qualitative research. For most readers, the synthesis of data presented by the researcher is of crucial significance—this is usually where “the story” of the participants can be distilled, summarized, and told in a manner that is both respectful to those participants and meaningful to readers. There are a number of ways in which researchers can synthesize and present their findings, but any conclusions drawn by the researchers must be supported by direct quotations from the participants. In this way, it is made clear to the reader that the themes under discussion have emerged from the participants’ interviews and not the mind of the researcher. The work of Latif and others 12 gives an example of how qualitative research findings might be presented.

Planning and Writing the Report

As has been suggested above, if researchers code and theme their material appropriately, they will naturally find the headings for sections of their report. Qualitative researchers tend to report “findings” rather than “results”, as the latter term typically implies that the data have come from a quantitative source. The final presentation of the research will usually be in the form of a report or a paper and so should follow accepted academic guidelines. In particular, the article should begin with an introduction, including a literature review and rationale for the research. There should be a section on the chosen methodology and a brief discussion about why qualitative methodology was most appropriate for the study question and why one particular methodology (e.g., interpretative phenomenological analysis rather than grounded theory) was selected to guide the research. The method itself should then be described, including ethics approval, choice of participants, mode of recruitment, and method of data collection (e.g., semistructured interviews or focus groups), followed by the research findings, which will be the main body of the report or paper. The findings should be written as if a story is being told; as such, it is not necessary to have a lengthy discussion section at the end. This is because much of the discussion will take place around the participants’ quotes, such that all that is needed to close the report or paper is a summary, limitations of the research, and the implications that the research has for practice. As stated earlier, it is not the intention of qualitative research to allow the findings to be generalized, and therefore this is not, in itself, a limitation.

Planning out the way that findings are to be presented is helpful. It is useful to insert the headings of the sections (the themes) and then make a note of the codes that exemplify the thoughts and feelings of your participants. It is generally advisable to put in the quotations that you want to use for each theme, using each quotation only once. After all this is done, the telling of the story can begin as you give your voice to the experiences of the participants, writing around their quotations. Do not be afraid to draw assumptions from the participants’ narratives, as this is necessary to give an in-depth account of the phenomena in question. Discuss these assumptions, drawing on your participants’ words to support you as you move from one code to another and from one theme to the next. Finally, as appropriate, it is possible to include examples from literature or policy documents that add support for your findings. As an exercise, you may wish to code and theme the sample excerpt in Appendix 1 and tell the participant’s story in your own way. Further reading about “doing” qualitative research can be found at the end of this paper.

CONCLUSIONS

Qualitative research can help researchers to access the thoughts and feelings of research participants, which can enable development of an understanding of the meaning that people ascribe to their experiences. It can be used in pharmacy practice research to explore how patients feel about their health and their treatment. Qualitative research has been used by pharmacists to explore a variety of questions and problems (see the “Further Reading” section for examples). An understanding of these issues can help pharmacists and other health care professionals to tailor health care to match the individual needs of patients and to develop a concordant relationship. Doing qualitative research is not easy and may require a complete rethink of how research is conducted, particularly for researchers who are more familiar with quantitative approaches. There are many ways of conducting qualitative research, and this paper has covered some of the practical issues regarding data collection, analysis, and management. Further reading around the subject will be essential to truly understand this method of accessing people’s thoughts and feelings to enable researchers to tell participants’ stories.

Appendix 1. Excerpt from a sample transcript

The participant (age late 50s) had suffered from a chronic mental health illness for 30 years. The participant had become a “revolving door patient,” someone who is frequently in and out of hospital. As the participant talked about past experiences, the researcher asked:

  • What was treatment like 30 years ago?
  • Umm—well it was pretty much they could do what they wanted with you because I was put into the er, the er kind of system er, I was just on
  • endless section threes.
  • Really…
  • But what I didn’t realize until later was that if you haven’t actually posed a threat to someone or yourself they can’t really do that but I didn’t know
  • that. So wh-when I first went into hospital they put me on the forensic ward ’cause they said, “We don’t think you’ll stay here we think you’ll just
  • run-run away.” So they put me then onto the acute admissions ward and – er – I can remember one of the first things I recall when I got onto that
  • ward was sitting down with a er a Dr XXX. He had a book this thick [gestures] and on each page it was like three questions and he went through
  • all these questions and I answered all these questions. So we’re there for I don’t maybe two hours doing all that and he asked me he said “well
  • when did somebody tell you then that you have schizophrenia” I said “well nobody’s told me that” so he seemed very surprised but nobody had
  • actually [pause] whe-when I first went up there under police escort erm the senior kind of consultants people I’d been to where I was staying and
  • ermm so er [pause] I . . . the, I can remember the very first night that I was there and given this injection in this muscle here [gestures] and just
  • having dreadful side effects the next day I woke up [pause]
  • . . . and I suffered that akathesia I swear to you, every minute of every day for about 20 years.
  • Oh how awful.
  • And that side of it just makes life impossible so the care on the wards [pause] umm I don’t know it’s kind of, it’s kind of hard to put into words
  • [pause]. Because I’m not saying they were sort of like not friendly or interested but then nobody ever seemed to want to talk about your life [pause]
  • nobody asked me any questions about my life. The only questions that came into was they asked me if I’d be a volunteer for these student exams
  • and things and I said “yeah” so all the questions were like “oh what jobs have you done,” er about your relationships and things and er but
  • nobody actually sat down and had a talk and showed some interest in you as a person you were just there basically [pause] um labelled and you
  • know there was there was [pause] but umm [pause] yeah . . .

This article is the 10th in the CJHP Research Primer Series, an initiative of the CJHP Editorial Board and the CSHP Research Committee. The planned 2-year series is intended to appeal to relatively inexperienced researchers, with the goal of building research capacity among practising pharmacists. The articles, presenting simple but rigorous guidance to encourage and support novice researchers, are being solicited from authors with appropriate expertise.

Previous articles in this series:

Bond CM. The research jigsaw: how to get started. Can J Hosp Pharm. 2014;67(1):28–30.

Tully MP. Research: articulating questions, generating hypotheses, and choosing study designs. Can J Hosp Pharm. 2014;67(1):31–4.

Loewen P. Ethical issues in pharmacy practice research: an introductory guide. Can J Hosp Pharm. 2014;67(2):133–7.

Tsuyuki RT. Designing pharmacy practice research trials. Can J Hosp Pharm. 2014;67(3):226–9.

Bresee LC. An introduction to developing surveys for pharmacy practice research. Can J Hosp Pharm. 2014;67(4):286–91.

Gamble JM. An introduction to the fundamentals of cohort and case–control studies. Can J Hosp Pharm. 2014;67(5):366–72.

Austin Z, Sutton J. Qualitative research: getting started. Can J Hosp Pharm. 2014;67(6):436–40.

Houle S. An introduction to the fundamentals of randomized controlled trials in pharmacy research. Can J Hosp Pharm. 2015;68(1):28–32.

Charrois TL. Systematic reviews: What do you need to know to get started? Can J Hosp Pharm. 2015;68(2):144–8.

Competing interests: None declared.

Further Reading

Examples of Qualitative Research in Pharmacy Practice

  • Farrell B, Pottie K, Woodend K, Yao V, Dolovich L, Kennie N, et al. Shifts in expectations: evaluating physicians’ perceptions as pharmacists integrated into family practice. J Interprof Care. 2010;24(1):80–9.
  • Gregory P, Austin Z. Postgraduation employment experiences of new pharmacists in Ontario in 2012–2013. Can Pharm J. 2014;147(5):290–9.
  • Marks PZ, Jennings B, Farrell B, Kennie-Kaulbach N, Jorgenson D, Pearson-Sharpe J, et al. “I gained a skill and a change in attitude”: a case study describing how an online continuing professional education course for pharmacists supported achievement of its transfer to practice outcomes. Can J Univ Contin Educ. 2014;40(2):1–18.
  • Nair KM, Dolovich L, Brazil K, Raina P. It’s all about relationships: a qualitative study of health researchers’ perspectives on interdisciplinary research. BMC Health Serv Res. 2008;8:110.
  • Pojskic N, MacKeigan L, Boon H, Austin Z. Initial perceptions of key stakeholders in Ontario regarding independent prescriptive authority for pharmacists. Res Soc Adm Pharm. 2014;10(2):341–54.

Qualitative Research in General

  • Breakwell GM, Hammond S, Fife-Schaw C. Research methods in psychology. Thousand Oaks (CA): Sage Publications; 1995.
  • Given LM. 100 questions (and answers) about qualitative research. Thousand Oaks (CA): Sage Publications; 2015.
  • Miles B, Huberman AM. Qualitative data analysis. Thousand Oaks (CA): Sage Publications; 2009.
  • Patton M. Qualitative research and evaluation methods. Thousand Oaks (CA): Sage Publications; 2002.
  • Willig C. Introducing qualitative research in psychology. Buckingham (UK): Open University Press; 2001.

Group Dynamics in Focus Groups

  • Farnsworth J, Boon B. Analysing group dynamics within the focus group. Qual Res. 2010;10(5):605–24.

Social Constructivism

  • Social constructivism. Berkeley (CA): University of California, Berkeley, Berkeley Graduate Division, Graduate Student Instruction Teaching & Resource Center; [cited 2015 June 4]. Available from: http://gsi.berkeley.edu/gsi-guide-contents/learning-theory-research/social-constructivism/

Mixed Methods

  • Creswell J. Research design: qualitative, quantitative, and mixed methods approaches. Thousand Oaks (CA): Sage Publications; 2009.

Collecting Qualitative Data

  • Arksey H, Knight P. Interviewing for social scientists: an introductory resource with examples. Thousand Oaks (CA): Sage Publications; 1999.
  • Guest G, Namey EE, Mitchel ML. Collecting qualitative data: a field manual for applied research. Thousand Oaks (CA): Sage Publications; 2013.

Constructivist Grounded Theory

  • Charmaz K. Grounded theory: objectivist and constructivist methods. In: Denzin N, Lincoln Y, editors. Handbook of qualitative research. 2nd ed. Thousand Oaks (CA): Sage Publications; 2000. pp. 509–35.
  • Open access
  • Published: 22 August 2024

Factors influencing fidelity to guideline implementation strategies for improving pain care at cancer centres: a qualitative sub-study of the Stop Cancer PAIN Trial

  • Tim Luckett 1 ,
  • Jane Phillips 2 ,
  • Meera Agar 1 , 3 ,
  • Linda Richards 4 ,
  • Najwa Reynolds 5 ,
  • Maja Garcia 1 ,
  • Patricia Davidson 6 ,
  • Tim Shaw 7 ,
  • David Currow 6 ,
  • Frances Boyle 8 , 9 ,
  • Lawrence Lam 10 ,
  • Nikki McCaffrey 11 &
  • Melanie Lovell 5 , 9  

BMC Health Services Research, volume 24, article number 969 (2024)


Background

The Stop Cancer PAIN Trial was a phase III pragmatic stepped wedge cluster randomised controlled trial which compared effectiveness of screening and guidelines with or without implementation strategies for improving pain in adults with cancer attending six Australian outpatient comprehensive cancer centres (n = 688). A system for pain screening was introduced before observation of a ‘control’ phase. Implementation strategies introduced in the ‘intervention’ phase included: (1) audit of adherence to guideline recommendations, with feedback to clinical teams; (2) health professional education via an email-administered ‘spaced education’ module; and (3) a patient education booklet and self-management resource. Selection of strategies was informed by the Capability, Opportunity and Motivation Behaviour (COM-B) Model (Michie et al., 2011) and evidence for each strategy’s stand-alone effectiveness. A consultant physician at each centre supported the intervention as a ‘clinical champion’. However, fidelity to the intervention was limited, and the Trial did not demonstrate effectiveness. This paper reports a sub-study of the Trial which aimed to identify factors inhibiting or enabling fidelity to inform future guideline implementation initiatives.

Methods

The qualitative sub-study enabled in-depth exploration of factors from the perspectives of personnel at each centre. Clinical champions, clinicians and clinic receptionists were invited to participate in semi-structured interviews. Analysis used a framework method and a largely deductive approach based on the COM-B Model.

Results

Twenty-four people participated, including 15 physicians, 8 nurses and 1 clinic receptionist. Coding against the COM-B Model identified ‘capability’ to be the most influential component, with ‘opportunity’ and ‘motivation’ playing largely subsidiary roles. Findings suggest that fidelity could have been improved by: considering the readiness for change of each clinical setting; better articulating the intervention’s value proposition; defining clinician roles and responsibilities, addressing perceptions that pain care falls beyond oncology clinicians’ scopes of practice; integrating the intervention within existing systems and processes; promoting patient-clinician partnerships; investing in clinical champions among senior nursing and junior medical personnel, supported by medical leaders; and planning for slow incremental change rather than rapid uptake.

Conclusions

Future guideline implementation interventions may require a ‘meta-implementation’ approach based on complex systems theory to successfully integrate multiple strategies.

Trial registration

Registry: Australian New Zealand Clinical Trials Registry; number: ACTRN 12615000064505; data: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspxid=367236&isReview=true .


Pain is a common and burdensome symptom in people with cancer [ 1 ]. Barriers to pain care occur at all ‘levels’, including the patient and family (e.g., misconceptions regarding opioids), clinician (e.g. lack of expertise), service (e.g. inadequate referral processes) and healthcare system (e.g. lack of coordination) [ 2 , 3 , 4 , 5 , 6 , 7 , 8 ]. A recent systematic review suggests that around 40% of cancer patients with pain may not receive adequate management [ 9 ]. Research has demonstrated that routine screening and implementation of evidence-based guidelines has potential to improve quality of cancer pain care and outcomes [ 10 , 11 , 12 , 13 , 14 ]. However, experience suggests that clinicians are unlikely to utilise screening results or follow guidelines unless these are supported by targeted strategies [ 15 , 16 ].

The Stop Cancer PAIN Trial (ACTRN 12615000064505) was a phase III pragmatic stepped wedge cluster randomised controlled trial conducted between 2014 and 2019 which compared the effectiveness of screening and guidelines with or without implementation strategies for improving pain in adults with cancer attending six outpatient comprehensive cancer centres in Australia ( n  = 688) [ 17 , 18 ]. A pen/paper system to screen for pain using 0–10 numerical rating scales (NRS) for worst and average intensity over the past 24 h was introduced to each centre prior to observation of a ‘control’ phase, in which clinicians were also made aware of the Australian Cancer Pain Management in Adults guidelines [ 19 ]. At the beginning of the training phase, trial investigators presented at staff meetings on the importance of better managing pain and the rationale and evidence base for the intervention components. Implementation strategies (collectively termed the ‘intervention’) were then introduced in a ‘training’ phase and maintained during an ‘intervention’ phase as follows: (1) audit of adherence to key guideline recommendations [ 19 ] and feedback delivered to clinical teams in one or two cycles; (2) health professional education via a ‘Qstream’ email-administered ‘spaced education’ module [ 20 ]; and (3) a patient education booklet and self-management resource for completion together with a clinician that included goal setting, a pain diary and pain management plan [ 21 , 22 ]. Selection of these strategies was informed by the Capability, Opportunity and Motivation Behaviour (COM-B) Model of behaviour change [ 23 ], and evidence that each strategy had been separately effective for supporting guideline implementation for other health conditions. The intervention was supported at each centre by a consultant physician who agreed to be a ‘clinical champion’ [ 24 ].

As reported previously [ 18 ], the Stop Cancer PAIN Trial found no significant differences between the intervention and the control phases on the trial’s primary outcome - the proportion of patients with moderate-severe worst pain intensity who reported a 30% decrease at 1-week follow-up. Fidelity to the intervention was lower than anticipated and variable between centres: only 2/6 centres had two audit cycles rather than one; completion rates for the health professional spaced education varied from 12% to 74% between centres; and the proportion of patients reporting receipt of written information of any kind rose to an average of only 30% (20-44%) versus 22% (2-30%) in the control phase. Unexpectedly, secondary measures of mean, worst and average pain over a 4-week follow-up period improved by 0.5 standard deviation during control as well as intervention phases. However, the lack of a comparison group with no screening system made it difficult to conclude whether improvement in the control phase was due to effects from screening, a Hawthorne effect, or some other explanation.
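To make the primary outcome concrete, the sketch below (Python) shows how a “30% decrease in worst pain at follow-up” could be computed from 0–10 NRS scores. It is not the trial’s analysis code: the patient records are invented, and the moderate-severe threshold of 5 or more is an assumption for illustration.

```python
# Illustrative only - not the trial's analysis code. Shows how the primary
# outcome ("proportion of patients with moderate-severe worst pain who report a
# 30% decrease at 1-week follow-up") could be computed from 0-10 NRS scores.
# The patient records are invented and the threshold is assumed.

patients = [
    {"id": 1, "baseline_worst": 8, "followup_worst": 5},
    {"id": 2, "baseline_worst": 6, "followup_worst": 6},
    {"id": 3, "baseline_worst": 7, "followup_worst": 3},
    {"id": 4, "baseline_worst": 3, "followup_worst": 2},  # below the assumed threshold
]

MODERATE_SEVERE = 5  # assumed cut-off on the 0-10 worst-pain NRS

eligible = [p for p in patients if p["baseline_worst"] >= MODERATE_SEVERE]
responders = [p for p in eligible
              if p["followup_worst"] <= 0.7 * p["baseline_worst"]]  # >= 30% decrease

print(f"{len(responders)}/{len(eligible)} eligible patients achieved a 30% decrease "
      f"({100 * len(responders) / len(eligible):.0f}%)")
```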

The current paper reports a sub-study of the Stop Cancer PAIN Trial which aimed to identify factors influencing fidelity to the intervention that might warrant consideration by similar initiatives in the future.

The intervention, methods and results of the Stop Cancer PAIN trial have been described in previous open-access articles [ 17 , 18 ]. The sub-study used a qualitative approach with pragmatic orientation to enable in-depth exploration of factors influencing success from the perspectives of clinicians at each participating centre [ 25 ]. Clinician views canvassed at interview were considered the most efficient means of identifying barriers and enablers among complex contextual factors at each centre, including personnel’s knowledge, attitudes and beliefs towards pain care and the intervention.

The sub-study was approved by the Southwestern Sydney Local Health District Human Research Ethics Committee (HREC/14/LPOOL/479) as part of the overall trial. All participants gave written informed consent to participate.

Reporting adheres to the consolidated criteria for reporting qualitative research (COREQ) [ 26 ].

Participants

Participants were eligible if they were employed on a permanent basis either full- or part-time at a participating centre in a role that provided clinical care to cancer patients or patient-focused administrative support. The clinical champion at each centre was invited to participate by the research team. Other personnel were invited by means of email circulars and verbal invitations during meetings. Given the diverse range of roles at each centre, no limit was set on sample size to canvass as many perspectives as possible.

Data collection

Data were collected by means of semi-structured interviews conducted by one of two researchers, a female pharmacist with experience of medical education for pain management (LR), and a male social scientist with a doctorate (TL). Both interviewers had prior experience in qualitative research and knew some participants through their project roles.

Participants were fully aware of the study purpose before consenting. Interviews were conducted face-to-face or by telephone, with the participant and interviewer being the only people present. Interviews began with open questions about ‘what worked’ and ‘didn’t work’ across the intervention before focusing on each implementation strategy in more detail and important contextual factors at their centre (see Table  1 for a topic guide, which was developed specifically for this study). Interviewers explicitly invited criticism, expressing a tone of open enquiry and neutrality throughout. Prompts were used as necessary to explore factors identified by participants in more detail. Factors identified at previous interviews were raised at subsequent ones for verification, inviting participants to disagree or agree as they felt appropriate. No requests were received to return transcripts to participants for comment. Interviews were audio-recorded and transcribed verbatim.

Analysis used the framework method [ 27 ] and a largely deductive approach based on the same theoretical framework used during intervention design - the COM-B Model [ 23 ]. Based on a systematic review, the COM-B Model posits that behaviour change requires three conditions, namely ‘capability’ (including both psychological and physical capacity), ‘opportunity’ (all the factors that lie outside the individual that make the behaviour possible or prompt it) and ‘motivation’ (including habitual processes, emotional responding, as well as analytical decision-making). Initial line-by-line coding categorized data against these conditions according to which best described relationships between factors and behaviours within and across implementation strategies and the levels of patient, clinician and centre. While the COM-B model originally focused directly on human behaviour, it became clear during coding that behaviour was substantially influenced by centre, specialty and disciplinary factors, so these were also considered appropriate foci for coding against COM conditions. To enhance credibility, the same data were coded in different ways where multiple interpretations seemed plausible until coding of further interviews identified consistencies to help with disambiguation. Charting of codes for data within and between centres enabled mapping between the relative contributions made by each condition, summarised as lessons learned for guiding similar initiatives in the future. Dependability was increased by ensuring coding was conducted by two members of the research team (NR, MG) who had no previous involvement in the project but were experienced in qualitative research. Review and discussion with two team members who were involved in the project throughout (TL and ML) was intended to balance ‘outsider’ and ‘insider’ perspectives to guard against bias from preconceived interpretations whilst also referencing contextual understanding. Both Excel 2019 (Microsoft) and NVivo V12 (QSR) software were used to help manage different stages of the analytic process.
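The “charting” step described above can be pictured as a matrix of cases against framework categories. The sketch below (Python, using pandas) is a minimal illustration assuming coded excerpts have already been labelled with a COM-B component; the excerpt summaries are invented, and a spreadsheet such as the Excel workbook mentioned above serves the same purpose.

```python
import pandas as pd

# Minimal sketch of the "charting" step of the framework method: coded excerpts,
# already labelled with a COM-B component, are summarised in a matrix of cases
# (here, centres) against framework categories. The summaries are invented.

coded = pd.DataFrame([
    {"centre": "Centre 1", "component": "capability",  "summary": "screening reallocated to nursing role"},
    {"centre": "Centre 5", "component": "motivation",  "summary": "intervention seen as imposed by management"},
    {"centre": "Centre 5", "component": "opportunity", "summary": "high turnover of junior medical staff"},
    {"centre": "Centre 3", "component": "capability",  "summary": "little experience as principal trial site"},
])

# One row per centre, one column per COM-B component, cell = joined summaries.
framework_matrix = (coded.groupby(["centre", "component"])["summary"]
                         .apply("; ".join)
                         .unstack(fill_value=""))
print(framework_matrix)
```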

Twenty-four people participated across the six centres, with between one and six participants per centre. Fifteen were physicians (of whom six were clinical champions), eight were nurses, and one was a clinic receptionist. The response rate ranged from 2% to 27% of eligible personnel at each centre. See Table 2 for a more detailed summary of participant roles at each centre. Interviews were a median of 20 min long, with an inter-quartile range of 13 to 28 min.

Capability, opportunity and motivation

Coding against the COM-B Model identified ‘capability’ to be the component having most influence over intervention success, with ‘opportunity’ and ‘motivation’ playing largely subsidiary roles.

Capabilities: Pertinent capabilities were reported to include: a pre-existing, centre-level culture of continuous improvement, communication pathways between senior management and other personnel, established roles and responsibilities for pain care among disciplines and specialties, systems and processes that could readily accommodate the intervention, and a culture of involving patients as partners in care. These capabilities influenced the degree to which personnel and patients had the opportunity and motivation to fully engage with the intervention.

Opportunity and motivation: These elements were most frequently discussed by participants in terms of ‘time’ that personnel could commit to pain care relative to other responsibilities. Clinical champions were perceived to play a critical role in supporting intervention success but were under-resourced at every centre and challenged by turnover in the role at two. In addition to more systemic drivers, individual personnel’s motivation was influenced by the degree to which they accepted the intervention’s value proposition at the outset and perceived this to be demonstrated over time.

Interactions between capability, opportunity and motivation are explored below in terms of their implications for similar future initiatives. Findings suggest that fidelity could have been improved by: considering the readiness for change of each clinical setting; better articulating the intervention’s value proposition; defining clinician roles and responsibilities, addressing perceptions that pain care falls beyond oncology clinicians’ scopes of practice; integrating the intervention within existing systems and processes; promoting patient-clinician partnerships; investing in clinical champions among senior nursing and junior medical personnel, supported by medical leaders; and planning for slow incremental change rather than rapid uptake.

Consider centres’ readiness for change

The degree to which centres had a pre-existing culture of continuous improvement was considered important in providing a fertile context for the intervention. At Centre 5, there was a consensus that change of any kind was difficult to instigate, even according to the head of department: “… because it’s new - because we’re so entrenched in our ways ” (C5P04 [Centre 5, participant 04] medical oncologist, head of department and clinical champion). At another, the complex centre-level nature of the intervention was perceived to pose particular challenges compared to oncology drug trials with which they were more familiar: “ we haven’t been a principal site [in a trial of this kind] previously and I think that’s sort of opened up some gaps in knowledge for us and some opportunities for learning in the future … what kind of support we’d need to come with that trial to help it be a success in this culture ” (C3P02 palliative care physician and clinical champion).

Articulate and deliver on the intervention’s value proposition

Interviews highlighted the importance of articulating the intervention’s value proposition to every member of the workforce and maintaining engagement by demonstrating benefits over time. At Centre 5, some participants perceived that the intervention had been imposed by management rather than generated from clinical priorities: “…senior staff say [to researchers] ‘come to our clinics, but we expect everyone else to do the work’ ” (C5P05 radiation oncologist). This was compounded by a perceived lack of communication about the project, which limited personnel’s opportunity to take a more active role even when they were motivated to do so: “ I would have facilitated [the intervention] … but I didn’t know about it ” (C5P01 nurse practitioner). Eliciting and maintaining engagement was said to be additionally challenged at this centre by high staff turnover, especially among junior medical officers on rotation: “ it was very accepted by the junior medical staff [but] I think, unfortunately, when there’s a relatively high turnover of staff … ” (C5P07 radiation oncology trainee). At two other centres, turnover among personnel required a transition in the role of clinical champion, interrupting support for the intervention while the new incumbents familiarised themselves with the role.

Across centres, participants reported reservations among some of their colleagues regarding the project’s fundamental premises, including the assumption that pain care needed improving at their centre (“ they actually felt this trial was a little bit insulting for their clinical skills. There was a bit of eye rolling and ‘of course we do that already!’ ” (C3P02 palliative care physician and clinical champion)) or that pain warranted a specific focus rather than symptoms more generally: “ I find it more useful when more than one symptom is targeted ” (C5P06 palliative care physician).

More specific criticism was also levelled at each of the intervention strategies as follows.

Pain screening

In the case of screening, two participants questioned the validity of a 0–10 numerical rating scale (NRS) for different reasons: “ sometimes getting the numbers breaks the flow of the narrative” (C6P04 medical oncologist); “they [patients] would say, ‘no, I’m not in pain but I have a lot of discomfort when I swallow’ - it was in the wording ” (C5P02 registered nurse). Even one of the clinical champions felt that screening was redundant where pain was very severe: “ if someone is clearly in a pain crisis, you don’t need to be asking … you kind of know what number - they might tell you it’s 15 [out of 10] ” (C6P02 palliative care physician and clinical champion). Perceptions of the value of screening were also influenced by the degree to which it led to demonstrable improvements in pain care, which was undermined by problems with establishing an efficient process at some centres: “ I think I’ve still probably got stray [pain screening] forms on my desk ” (C3P06 palliative care physician). A lack of understanding among personnel and patients about how screening might lead to better pain outcomes was said to result in “ fatigue ” (C5P03 clinical nurse consultant; C1P01 palliative care physician and clinical champion), manifest as a downward spiral of effort in, and value from, screening.

Audit and feedback

The audit and feedback strategy attracted limited attention from personnel at most centres: “ I don’t think that the audit and feedback were terribly noticeable ” (C4P01 medical oncologist and clinical champion). At the centre where only the palliative care department participated, one participant perceived baseline audit results to be acceptable and therefore demotivating for change: “[ the audit results showed] we were doing a good job even ahead of time … it did sort of make you think – ‘well where do we go from here?’ ” (C6P04 pain medicine physician). At another centre, motivation among personnel to improve on less favourable audit findings was perceived to depend on whether they prioritised pain care to start with: “ people have come up to me and said, ‘Gee, we really did very badly didn’t we?’ … but they’re not necessarily the people who don’t treat pain well - that’s the problem ” (C1P01 palliative care physician and clinical champion).

Spaced education for health professionals

Participants’ opinion on the value of the online spaced education depended on discipline and seniority, with nurses and junior medical officers reporting benefits (“it gave me a bit more confidence that I was on the right track” (C5P01 nurse practitioner)) but consultant physicians perceiving the knowledge level as too “basic” (C6P04 pain medicine physician) or questioning advice from the online spaced education that their responses were ‘wrong’: “…some of the multiple answers could have been equally valid” (C5P04 medical oncologist and clinical champion). Where consultants remained engaged, motivation was said to rely on cultivating “ competition” between colleagues (C6P02 palliative care physician and clinical champion). Inevitably, the voluntary nature of online spaced education also meant that only motivated personnel engaged to begin with.

Patient self-management resource

All participants who had used the patient self-management resource perceived at least some value. However, its use was limited by barriers relating to role and process considered below.

Define roles and responsibilities

Among the most commonly voiced barriers was a lack of clarity about which specialties and disciplines should be responsible for pain screening, patient education and management. This was usually described in terms of a ‘lack of time’ for pain care relative to other duties afforded greater priority within their scope of practice. Perspectives on roles and responsibilities are considered separately for each aspect of pain care as follows.

Pain screening

While most centres allocated the clinical task of pain screening to clinic receptionists, there was widespread reflection that this had been suboptimal. The only participating clinic receptionist felt that pain screening fell outside her area of responsibility: “but I’m an administrative person - I don’t have anything to do with pain management ” (C2P03 clinic receptionist). Clinician participants across disciplines similarly perceived that pain screening required clinical expertise to assist patients with reporting their pain and triage for urgent follow-up: “ you need somebody talking to the patients, rather than just handing the form, say ‘fill this in’ ” (C2P04 clinical nurse consultant). One centre that recognised this early on reallocated screening from an administrative to a nursing role, leading to substantial improvements in the completeness and quality of data: “ it made a big difference and certainly improved our ability to recognise people who had pain and allowed access for those people who were in severe pain to medications or at least an assessment … implementation through the clerical staff was not a long-term strategy ” (C1P01 palliative care physician and clinical champion).

Patient education

There was little consensus on which disciplines should be responsible for supporting patients to use the self-management resource, with medical personnel deferring to nurses and vice-versa. Role allocation was challenged by the diverse components within the resource, with each perceived to fall within a different scope of practice: “ pain is something I always do as an assessment … [but] … I’m not managing the pain … I’ll review and make recommendations and talk about the pain diaries and discussing their diary with their palliative care doctor or their general practitioner. And I would encourage that process. [But] I wouldn’t be the one that’s setting the goals on their daily activities and stuff ” (C5P01 nurse practitioner). Some oncology nursing roles were perceived to focus on chemo- or radiotherapy protocols to the exclusion of supportive care unless symptoms arose from, or impeded, treatment. Meanwhile, oncologists tended to interpret their role as solely focused on prescribing rather than also encompassing patient education: “ junior doctors only [have] 15 minutes to take a history and everything. [They] could enter in meds [into the patient resource] if everything else is done by someone else … part of me knows it’s [patient resource] important, but the other part of me - I just - when will I have time in my clinical practice to do it? ” (C5P05 radiation oncologist).

Pain management

Some oncologists viewed even pharmacological pain management as peripheral to their scope of practice when consultation time was short, prioritising cancer treatment instead. These participants viewed their role as limited to referring to palliative medicine or pain specialists, especially where pain was believed to have causes other than cancer: “ if the pain is a complex pain where the patient doesn’t have evidence of cancer, and it may be treatment-related, then in those scenarios we tend to divert to the chronic pain team ” (C5P07 radiation oncology advanced trainee). While participants from palliative care and pain medicine welcomed referrals for complex cases, they felt that oncologists sometimes referred for pain they could have easily managed themselves: “ what about some regular paracetamol? … These are things that you’d expect any junior doctors, never mind consultants [to have provided advice on] ” (C5P06 palliative care physician).

Integrate within existing systems and processes

Participants from several centres expressed a view that the intervention’s complex nature had proven overwhelming for systems and processes at their centres. At two centres, integration was especially challenged by broader infrastructure shifts and process failures that limited receptiveness to further changes. Participants at several centres emphasised the process-driven nature of oncology services and the challenge of changing established processes: “ they have got a pro forma that they use for chemo-immunotherapy review, and pain is not part of it, and that perhaps needs more of an organisational nuance … why doesn’t pain feature as a clinical outcome as part of the chemotherapy, immunotherapy review?” (C6P01 clinical nurse consultant). Participants emphasised the need to integrate pain care into existing processes to help personnel understand what was expected of them: “…nursing staff were getting them [screening forms] in the patient’s files and going, ‘what am I supposed to do with this?’ ” (C2P04 clinical nurse consultant). Moreover, centres’ focus on cancer treatment meant that pain care struggled to gain traction even when a process could be instituted: “ unless pain is the presenting complaint and is at the forefront it goes into those, sorts of, you know, the ‘other details’ ” (C5P06 palliative care physician). For the palliative care centre, where pain care was already prioritised, there were doubts about how the proposed process improved on those already in place: “ I generally ask pretty detailed questions about pain anyway [so don’t need patients to be screened in the waiting room] ” (C6P04 pain medicine physician).

Suggestions for better integrating the intervention included “in-building” (C3P04 medical oncologist) responsibility for the strategies within new staff roles or introducing the strategies gradually by means of a “ multistep process” (C5P04 medical oncologist, head of department and clinical champion). Features of two strategies were singled out as having positive potential for supporting existing processes of care. The patient resource was said to “ facilitate communication between the oncology teams and the palliative care team ” (C5P05 radiation oncologist) and serve as a “ visual cue ” (C3P02 medical oncologist) to cover educational topics that “ they might have otherwise forgotten ” (C2P01 palliative care physician and clinical champion). Participants also found the spaced education email administration, spacing and repetition “ easy to manage ” (C2P01 palliative care physician and clinical champion) within their daily routines.

Promote patient-clinician partnership on pain care

Several participants expressed surprise at the prevalence of moderate-severe pain in screening results, and acknowledged that this revealed under-reporting of pain in usual care. Under-reporting was perceived to stem partly from patient expectations that pain from cancer was “ normal ” (C4P03 nurse practitioner) and to be especially common in the context of certain generational or cultural attitudes towards pain and opioids (“ I certainly think there’s a cultural element but there’s also your elderly patients who you know have been through the war and they’re just used to coping with things and you just suck it up … it’s like a badge of honour to be able to say ‘I’m not one of these pill-takers ’” (C3P03 registered nurse [RN])) or when patients were concerned that reporting pain might reduce their fitness for anti-cancer treatment: “[ patients might think that] if I tell them honestly how crappy I am with other symptoms and pain and everything, then they might stop my chemo” (C3P02 palliative care physician). Several participants perceived that under-reporting was also due to patients taking an overly passive role in consultations: “[clinicians assume that] if the patient doesn’t bring it up, it’s not a problem for them and … then the patient [is] thinking ‘the doctor will only talk about important things that are important for me and I won’t mention it because obviously it’s not important’ ” (C3P02 palliative care physician and clinical champion).

The screening component of the intervention was considered to address under-reporting by “ normal[ising] ” pain care, thus encouraging disclosure. The patient resource was also considered helpful for building patient capability to partner with clinicians on pain management by “ encouraging self-efficacy ” (C2P01 palliative care physician and clinical champion) through the tools it provided and its positive message that “ you can get control of your pain ” (C3P02 palliative care physician and clinical champion). It was also perceived to help patients “ keep a record ” (C5P03 clinical nurse consultant) of breakthrough pain and analgesia to discuss in their consultation. However, some participants delineated patient groups who might be less able to use the resource, including those with lower educational levels who struggled to set goals and identify an ‘acceptable’ level of pain balanced against side-effects from pharmacological management. For these patients, it was suggested that too many resources could be overwhelming rather than supportive: “ it’s almost like, the more resources they have, the less resourced there are ” (C5P06 RN). At one centre with an especially diverse demographic, patients were said to require substantial support even to understand the purpose and process of pain screening: “ most [patients] look at you going ‘oh, do I have to do anything?’ … They don’t want to read the [instruction] page which is relatively simple ” (C2P03 clinic receptionist).

Invest in clinical champions

All participants perceived the role of clinical champion to be pivotal to the intervention’s success. Champions were perceived to have two major responsibilities: advocating for the intervention among colleagues to boost motivation and providing practical support to build capability.

To be effective advocates, champions were perceived to need support from senior management ( “[leadership of change] it’s got to happen from the top ” (C5P02 RN)) as well as established, cordial relationships with colleagues they could leverage to motivate engagement: “ it also relies on the champion’s personal relationship with the staff which you’re asking to perform these roles and trying to change their management ” (C1P01 palliative care physician and clinical champion). Where champions felt under-supported by management, they relied on moral support from the project team to sustain their advocacy work: “ being the champion, and sometimes being the nagging champion, it actually felt quite nice to have the back-up of other people ” (C1P01 palliative care physician and clinical champion). Both physicians and nurses perceived the champion role might better suit the scope of practice of a junior doctor or senior nurse rather than consultants, based on their willingness to engage and approachability: “ realistically, you’re probably always going to get more engagement with registrars compared to consultants, unless it’s their own trial ” (C5P07 radiation oncologist); “ just give it [the role] to the CNCs [clinical nurse consultants] because as a general rule they’re the best at everything and have the best relationships with the patient ” (C3P04 medical oncologist).

From a practical perspective, clinical champions were expected to provide human resources for establishing and supporting pain screening and patient education: “ you need a body ” (C2P04 clinical nurse consultant). Unfortunately, however, champions across centres reported having limited time protected for the role within their usual duties: “ there just wasn’t the manpower to do that here ” (C3P02 palliative care physician and clinical champion). One suggestion for boosting capacity was to narrow the focus to one clinic and delegate practical tasks to more junior personnel than those needed for advocacy, making the time commitment more cost-effective: “[ it] might have been better to focus on one clinic and have full-time … junior nurse ” (C5P05 radiation oncologist). This presented an opportunity to train more than one clinical champion to provide better coverage across shifts and safeguard against the risk of losing champions to staff turnover.

Increasing pain awareness is the first step: Plan for slow incremental change rather than rapid uptake

While the barriers above meant only modest practice changes could be achieved, champions at half the centres perceived incremental progress had been made through increasing awareness among personnel regarding pain care as a focus for improvement: “ I think just trying to make pain something that people think about was probably one of the better strategies ” (C1P01 palliative care physician and clinical champion); “ it’s more at the top of our minds to remember, to screen the pain at every visit ” (C2P01 palliative care physician and clinical champion); “ I think it has highlighted those issues for us and we now need to take this on ” (C5P04 medical oncologist, head of department and clinical champion). Both nursing and medical participants at Centre 5 emphasized the need to be persistent in striving for continuous improvement: “ I think to get practice change, even for well-motivated people, I think it just needs to be pushed … they’ve done similar things with hand washing for doctors and it’s finally getting through ” (C5P04 medical oncologist and clinical champion); “ it would take more than just one of these kind of programs to get people to change ” (C5P03 clinical nurse consultant). Encouragingly, participants at this and one other centre expected some clinicians to continue using the patient education booklet and resource after the project ended: “ I’d just love to continue using these booklets ” (C5P02 RN); “[the] patient-held resource has been useful and has been taken up by people and I think they will continue to use those ” (C6P02 palliative care physician and clinical champion).

This qualitative sub-study of a cluster randomized controlled trial identified centre-level capabilities to be the most influential factors impeding or facilitating guideline implementation strategies for improving pain care for outpatients with cancer. Findings suggest that future initiatives of this kind should: consider centre readiness for change; articulate and deliver on the intervention’s value proposition; define clinician roles and responsibilities; integrate the intervention within existing systems and processes; promote patient partnership; invest in the clinical champion role, drawing from senior nurses and junior doctors, with support from medical leaders and management; and design the initiative around slow incremental change rather than rapid uptake.

Our findings are largely consistent with those from an ethnographic study exploring factors influencing implementation of cancer pain guidelines in Korean hospital cancer units, which identified a ‘lack of receptivity for change’ to be a key barrier [ 28 ]. However, observations from the Korean study suggested that a lack of centre leadership and cultural norms regarding nursing hierarchy were the most important underlying factors, whereas our Australian sample focused more on constraints imposed by centre systems and processes and a lack of clarity regarding disciplinary roles. These factors were consistently emphasized regardless of participants’ discipline and seniority, including by one centre’s head of department. Consistent with these findings, a recent Australian qualitative sub-study of anxiety/depression guideline implementation in oncology centres found greater role flexibility to be a key factor underpinning organisational readiness for change [ 29 ]. This team also provided quantitative evidence consistent with our finding that centres’ readiness for change is associated with personnel’s perception of benefit from guideline implementation [ 30 ]. Future initiatives should work harder to persuade clinicians of the intervention’s rationale and evidence base prior to commencement, given that perceptions of coherence and effectiveness are key dimensions of acceptability required for clinicians to invest time and effort [ 31 ]. Since our Trial was conducted, evidence has emerged for an impact from cancer symptom screening on survival that could be used persuasively [ 32 ]. Furthermore, the spaced education module might be more acceptable if made adjustable to the knowledge levels of a broader range of clinicians.

Other studies on implementation of cancer pain guidelines [ 11 , 13 ] suggest that structured approaches to process change tend to be more successful than less prescriptive approaches of the kind taken in the Stop Cancer PAIN Trial. We provided centres with guideline implementation strategies but no clear guidance on how to integrate these within existing contexts - i.e. implementation of the implementation, or ‘meta-implementation’. It was wrongly assumed that clinical champions could support integration with centre processes based on their knowledge of local context, but this turned out to be unreasonable given champions’ limited time for the role and lack of training in change management. Like most research to date [ 33 , 34 ], our trial focused largely on the advocacy role played by clinical champions, neglecting more practical and time-consuming aspects that our interviews identified to be just as important. We join others in calling for more research on the mechanisms by which clinical champions can optimally facilitate change and ways to maximize their efficacy through training and support [ 24 ]. This should include exploration of optimal models by which different aspects of the champion role might be shared between more than one person where no-one is available with all the necessary attributes, as well as ways to ensure sustainability after support from the project team is withdrawn.

Theory-based research suggests that adding complex interventions to complex healthcare systems creates dynamic interplay and feedback loops, making consequences hard to predict [ 35 ]. In the current trial, this was likely exacerbated by our attempt to combine multiple strategies targeting patient, clinician and centre levels. We chose each strategy based on evidence for its stand-alone efficacy, and combined strategies rather than used them singly with the intent of leveraging complementary mechanisms, as recommended by the COM-B Model and US Institute of Medicine [ 36 ]. However, findings from our interviews suggest that interactions between the strategies and local processes separated their spheres of influence, precluding intended synergies. The Stop Cancer PAIN Trial is not alone in having over-estimated the value of combining guideline implementation strategies; a recent systematic review found that eight other multi-component interventions similarly demonstrated limited effects on guideline adherence and patient outcomes [ 37 ]. Collectively, these findings suggest that future attempts at combining strategies should consider complex systems theory as well as behaviour change frameworks at each of a number of stages [ 38 ]. Alternatively, a more manageable approach for most cancer centres might be to focus on just one component at a time, periodically reviewing progress against SMART goals and, depending on results, supplementing with additional components using plan-do-study-act cycles [ 39 ].

Given the challenges with integrating screening into centre processes, it seems unlikely that improvements in pain scores during the control phase reported in our primary results article were due to the spontaneous use of screening data in consultations [ 18 ]. Indeed, while routine use of patient-reported outcome measures (PROMs) in oncology has been researched for more than a quarter-century [ 40 ], benefits to patient outcomes have only recently been demonstrated in the context of electronically-administered PROMs (ePROMs) that enable remote self-reporting, real-time feedback to clinicians, and clinician-patient telecommunication [ 12 ]. Further research is needed on how best to support clinician engagement with ePROMs, including training on how to use results in partnership with patients to assist shared decision-making and self-management [ 41 ].

A worrying finding from the current study was that some or all aspects of pain care were perceived to fall between the scopes of practice for oncology clinicians from each discipline. Clinical practice guidelines emphasize the need for pain care to be inter-disciplinary in recognition of the need for comprehensive assessment, non-pharmacological as well as pharmacological management, and patient education and support for self-management [ 42 ]. While the patient self-management resource included in the intervention was perceived to support communication between clinicians and patients, its potential for assisting coordination of care between disciplines was limited where roles and responsibilities were not previously established. Our findings and other research suggest that future initiatives may benefit from ‘process mapping’ with clinicians to identify where clinical workflow and roles might be reconfigured to incorporate the various aspects of pain care in the most efficient ways that do not substantially add to workload [ 41 ].

Clinical trials have shown that patient education improves pain outcomes [ 43 , 44 ], and we have argued previously that supporting pain self-management should be core business for all clinicians working in cancer care [ 45 ]. The ‘coaching’ approach needed to empower patients to recognize themselves as ‘experts’ on their pain and equal partners with clinicians in its management is iterative rather than a single event, and is ideally built on established and ongoing therapeutic relationships of trust with a particular team member. However, findings from patient education research more generally suggest that patient education and behaviour change are also optimally supported when key messages are reinforced by differing disciplinary perspectives [ 46 ]. Results from the current study suggest that these principles of pain care need more formal recognition within the scope of practice of oncology clinicians to ensure they are afforded sufficient time alongside anti-cancer treatment and related supportive care. Findings also indicate that clinicians may require training in the person-centred, partnership-oriented aspects of pain care beyond the educational approach used in the Stop Cancer PAIN Trial and other research [ 47 ]. Such training should be repeated regularly to ensure it reaches the majority of personnel at cancer centres, allowing for turnover.

Limitations

The current study had several limitations. Transferability even within Australia is limited by a focus on metropolitan services in only three out of eight jurisdictions. Data relied on clinician perspectives, and the response rate was less than one quarter of personnel at each centre, with the disciplines and specialties of participants being unrepresentative of centre workforces. Over-sampling of medical compared to nursing personnel likely reflects the fact that all clinical champions were medical consultants, while the predominance of palliative care physicians among medical participants presumably arises from the central focus this specialty has on pain care. Notably, our sample included no perspectives from allied health disciplines, despite the important roles these can play in non-pharmacological pain management. Confirmability was threatened by the potential for cognitive bias among researchers towards a favourable view of the intervention given their long-standing investment as members of the project team. We attempted to offset this by explicitly inviting criticism of the intervention from participants, and having the initial analysis conducted by researchers with no prior involvement in the project. A final limitation concerns reliance on the COM-B Model for analysis rather than an alternative framework or more inductive approach. While the COM-B has been widely used to explore barriers and facilitators across a wide range of healthcare interventions, we applied the model in a somewhat novel way to systems and processes as well as individuals’ behaviour after finding that participants perceived their agency to be substantially constrained by these. An implementation framework such as the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework [ 48 ] or the Consolidated Framework for Implementation Research (CFIR) [ 49 ] would have conceived of factors and their relationships in alternative ways that might have proven equally informative [ 50 ].

This qualitative sub-study elucidated important factors influencing the success of guideline implementation strategies at six cancer centres in the Stop Cancer PAIN Trial. Findings underscore the value that a qualitative approach offers for understanding the role of context when evaluating complex interventions [ 51 ]. Ultimately, the Stop Cancer PAIN Trial may have been overly ambitious in the scale of its intervention, especially given limited resources available at each centre. Further research is needed to understand how multi-component guideline implementation strategies can be optimally introduced within the context of local roles, systems and processes.

Availability of data and materials

The qualitative interview datasets generated and analysed during the current study are not publicly available due to the conditions of ethical approval which acknowledge the risk of participant re-identification.

Van Den Beuken-Van MH, Hochstenbach LMJ, Joosten EAJ, Tjan-Heijnen VCG, Janssen DJA. Update on prevalence of pain in patients with cancer: systematic review and meta-analysis. J Pain Symptom Manag. 2016;51(6):1070–90.

Luckett T, Davidson PM, Green A, Boyle F, Stubbs J, Lovell M. Assessment and management of adult cancer pain: a systematic review and synthesis of recent qualitative studies aimed at developing insights for managing barriers and optimizing facilitators within a comprehensive framework of patient care. J Pain Symptom Manag. 2013;46(2):229–53.

Oldenmenger WH, Sillevis Smitt PAE, van Dooren S, Stoter G, van der Rijt CCD. A systematic review on barriers hindering adequate cancer pain management and interventions to reduce them: a critical appraisal. Eur J Cancer. 2009;45(8):1370–80.

Jacobsen R, Sjogren P, Moldrup C, Christrup L. Physician-related barriers to cancer pain management with opioid analgesics: a systematic review. J Opioid Manag. 2007;3(4):207–14.

Jacobsen R, Moldrup C, Christrup L, Sjogren P. Patient-related barriers to cancer pain management: a systematic exploratory review. Scand J Caring Sci. 2009;23(1):190–208.

Jacobsen R, Liubarskiene Z, Moldrup C, Christrup L, Sjogren P, Samsanaviciene J. Barriers to cancer pain management: a review of empirical research. Medicina (Kaunas). 2009;45:427–33.

Fazeny B, Muhm M, Hauser I, Wenzel C, Mares P, Berzlanovich A, et al. Barriers in cancer pain management. Wiener Klinische Wochenschrift. 2000;112(22):978–81.

Makhlouf SM, Pini S, Ahmed S, Bennett MI. Managing pain in people with cancer—a systematic review of the attitudes and knowledge of professionals, patients, caregivers and Public. J Cancer Educ. 2020;35(2):214–40.

Roberto A, Greco MT, Uggeri S, Cavuto S, Deandrea S, Corli O, et al. Living systematic review to assess the analgesic undertreatment in cancer patients. Pain Pract. 2022;22(4):487–96.

Brink-Huis A, van Achterberg T, Schoonhoven L, Brink-Huis A, van Achterberg T, Schoonhoven L. Pain management: a review of organisation models with integrated processes for the management of pain in adult cancer patients. J Clin Nurs. 2008;17(15):1986–2000.

Du Pen SL, Du Pen AR, Polissar N, Hansberry J, Kraybill BM, Stillman M, et al. Implementing guidelines for cancer pain management: results of a randomized controlled clinical trial. J Clin Oncol. 1999;17(1):361–70.

Govindaraj R, Agar M, Currow D, Luckett T. Assessing patient-reported outcomes in routine cancer clinical care using electronic administration and telehealth technologies: Realist synthesis of potential mechanisms for improving health outcomes. J Med Internet Res. 2023;25:e48483.

Koesel N, Tocchi C, Burke L, Yap T, Harrison A. Symptom distress: implementation of palliative care guidelines to improve pain, fatigue, and anxiety in patients with advanced cancer. Clin J Oncol Nurs. 2019;23(2):149–55.

Dulko D, Hertz E, Julien J, Beck S, Mooney K. Implementation of cancer pain guidelines by acute care nurse practitioners using an audit and feedback strategy. J Am Acad Nurse Pract. 2010;22(1):45–55.

Samuelly-Leichtag G, Adler T, Eisenberg E. Something must be wrong with the implementation of cancer-pain treatment guidelines. A lesson from referrals to a pain clinic. Rambam Maimonides Med J. 2019;10(3):18.

Lopez Lopez R, Camps Herrero C, Khosravi-Shahi P, Guillem Porta V, Carrato Mena A, Garcia-Foncillas J, et al. Oncologist’s knowledge and implementation of guidelines for breakthrough cancer pain in Spain: CONOCE study. Clin Transl Oncol. 2018;20(5):613–8.

Luckett T, Phillips J, Agar M, Lam L, Davidson PM, McCaffrey N, et al. Protocol for a phase III pragmatic stepped wedge cluster randomised controlled trial comparing the effectiveness and cost-effectiveness of screening and guidelines with, versus without, implementation strategies for improving pain in adults with cancer attending outpatient oncology and palliative care services: the Stop Cancer PAIN trial. BMC Health Serv Res. 2018;18(1):1–13.

Lovell MR, Phillips JL, Luckett T, Lam L, Boyle FM, Davidson PM, et al. Effect of cancer pain guideline implementation on pain outcomes among adult outpatients with cancer-related pain: a stepped wedge cluster randomized trial. JAMA Netw Open. 2022;5(2):1–12.

Lovell M, Luckett T, Boyle F, Stubbs J, Phillips J, Davidson PM, et al. Adaptation of international guidelines on assessment and management of cancer pain for the Australian context. Asia-Pac J Clin Oncol. 2015;11(2):170–7.

Phillips JL, Heneka N, Bhattarai P, Fraser C, Shaw T. Effectiveness of the spaced education pedagogy for clinicians’ continuing professional development: a systematic review. Med Educ. 2019;53(9):886–902.

Cancer Council Australia. Overcoming cancer pain: a guide for people with cancer, their families and friends. 2013. Available from: http://www.cancercouncil.com.au/wp-content/uploads/2014/01/Can487-Overcoming-Pain-NSW-Lores2.pdf .

Luckett T, Davidson PM, Green A, Marie N, Birch MR, Stubbs J, et al. Development of a cancer pain self-management resource to address patient, provider, and health system barriers to care. Palliat Support Care. 2019;17(4):472–8.

Michie S, Atkins L, West R. The behaviour change wheel: a guide to designing interventions. London: Silverback Publishing; 2014.

Santos WJ, Graham ID, Lalonde M, Demery Varin M, Squires JE. The effectiveness of champions in implementing innovations in health care: a systematic review. Implement Sci Commun. 2022;3(1):80.

Ramanadhan S, Revette AC, Lee RM, Aveling EL. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun. 2021;2(1):70.

Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349–57.

Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13(1):117.

Kim M, Jeong S, McMillan M, Higgins I. Translation of policy and guidelines into practice: lessons learned from implementation of the cancer pain management guideline. J Hosp Manage Health Policy. 2020;4.  https://doi.org/10.21037/jhmhp.2020.02.01 .

Geerligs L, Shepherd HL, Butow P, Shaw J, Masya L, Cuddy J, et al. What factors influence organisational readiness for change? Implementation of the Australian clinical pathway for the screening, assessment and management of anxiety and depression in adult cancer patients (ADAPT CP). Support Care Cancer. 2021;29(6):3235–44.

Faris MM, Shepherd HL, Butow PN, Kelly P, He S, Rankin N, et al. Staff- and service-level factors associated with organisational readiness to implement a clinical pathway for the identification, assessment, and management of anxiety and depression in adults with cancer. BMC Health Serv Res. 2023;23(1):866.

Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res. 2017;17(1):88.

Caminiti C, Maglietta G, Diodati F, Puntoni M, Marcomini B, Lazzarelli S, et al. The effects of patient-reported outcome screening on the survival of people with cancer: a systematic review and meta-analysis. Cancers. 2022;14(21):5470.

Morena AL, Gaias LM, Larkin C. Understanding the role of clinical champions and their impact on clinician behavior change: the need for causal pathway mechanisms. Front Health Serv. 2022;2:896885.

Miech EJ, Rattray NA, Flanagan ME, Damschroder L, Schmid AA, Damush TM. Inside help: an integrative review of champions in healthcare-related implementation. SAGE Open Med. 2018;6:2050312118773261.

Braithwaite J, Churruca K, Long JC, Ellis LA, Herkes J. When complexity science meets implementation science: a theoretical and empirical analysis of systems change. BMC Med. 2018;16:1–14.

Institute of Medicine (US) Committee on Standards for Developing Trustworthy Clinical Practice Guidelines. Clinical Practice Guidelines We Can Trust. In: Graham R, Mancher M, Miller Wolman D, Greenfield S, Steinberg E, editors. Washington, D.C.: National Academies Press; 2011.

Bora A-M, Piechotta V, Kreuzberger N, Monsef I, Wender A, Follmann M, et al. The effectiveness of clinical guideline implementation strategies in oncology—a systematic review. BMC Health Serv Res. 2023;23(1):347.

McGill E, Er V, Penney T, Egan M, White M, Meier P, et al. Evaluation of public health interventions from a complex systems perspective: a research methods review. Soc Sci Med. 2021;272: 113697.

Institute for Healthcare Improvement. Quality improvement essentials toolkit. 2017.

Trowbridge RDW, Jay SJ, et al. Determining the effectiveness of a clinical-practice intervention in improving the control of pain in outpatients with cancer. Acad Med. 1997;72:798–800.

Howell D, Rosberger Z, Mayer C, Faria R, Hamel M, Snider A, et al. Personalized symptom management: a quality improvement collaborative for implementation of patient reported outcomes (PROs) in ‘real-world’ oncology multisite practices. J Patient Rep Outcomes. 2020;4(1):47.

Australian Adult Cancer Pain Management Guideline Working Party. Australian clinical pathway for screening, assessment and management of cancer pain in adults. Sydney, Australia: Cancer Council Australia; 2013. http://wiki.cancer.org.au/australia/Guidelines:Cancer_pain_management/Flowchart_overview .

Marie N, Luckett T, Davidson P, Lovell M, Lal S. Optimal patient education for cancer pain: a systematic review and theory-based meta-analysis. Support Care Cancer. 2013;21:3529–37.

Abernethy AP, Currow DC, Shelby-James T, Rowett D, May F, Samsa GP, et al. Delivery strategies to optimize resource utilization and performance status for patients with advanced life-limiting illness: results from the palliative care trial [ISRCTN 81117481]. J Pain Symptom Manage. 2013;45(3):488–505.

Lovell MR, Luckett T, Boyle FM, Phillips J, Agar M, Davidson PM. Patient education, coaching, and self-management for cancer pain. J Clin Oncol. 2014;32(16):1712–20.

Luckett T, Roberts MM, Smith T, Swami V, Cho JG, Wheatley JR. Patient perspectives on how to optimise benefits from a breathlessness service for people with COPD. NPJ Prim Care Respir Med. 2020;30(1). https://doi.org/10.1038/s41533-020-0172-4 .

Phillips JL, Heneka N, Lovell M, Lam L, Davidson P, Boyle F, et al. A phase III wait-listed randomised controlled trial of novel targeted inter-professional clinical education intervention to improve cancer patients’ reported pain outcomes (the Cancer Pain Assessment (CPAS) Trial): study protocol. Trials. 2019;20:1–12.

Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11(1):33.

Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17(1):1–16.

Geerligs L, Rankin NM, Shepherd HL, Butow P. Hospital-based interventions: a systematic review of staff-reported barriers and facilitators to implementation processes. Implement Sci. 2018;13(1):36.

Skivington K, Matthews L, Simpson SA, Craig P, Baird J, Blazeby JM, et al. A new framework for developing and evaluating complex interventions: update of medical research council guidance. BMJ. 2021;374:n2061.

Acknowledgements

The authors would like to dedicate this article to the memory of Sally Fielding, who worked as a valued member of the project team throughout the Stop Cancer PAIN Trial. We would also like to acknowledge the contributions of project manager A/Prof Annmarie Hosie, data manager Dr Seong Cheah, and research assistant Layla Edwards.

This research was supported by a grant from the National Breast Cancer Foundation.

Author information

Authors and Affiliations

IMPACCT Centre—Improving Palliative, Aged and Chronic Care through Clinical Research and Translation, Faculty of Health, University of Technology Sydney (UTS), Building 10, 235 Jones St, Ultimo, Sydney, NSW, 2007, Australia

Tim Luckett, Meera Agar & Maja Garcia

School of Nursing and Centre for Healthcare Transformation, Queensland University of Technology (QUT), Brisbane, QLD, Australia

Jane Phillips

South West Sydney School of Clinical Medicine, University of New South Wales (UNSW), Sydney, NSW, Australia

The Limbic, Sydney, Australia

Linda Richards

Palliative Care Department, Greenwich Hospital, HammondCare, Sydney, NSW, Australia

Najwa Reynolds & Melanie Lovell

University of Wollongong, Wollongong, NSW, Australia

Patricia Davidson & David Currow

Charles Perkins Centre, School of Medical Sciences, The University of Sydney, Sydney, NSW, Australia

Patricia Ritchie Centre for Cancer Care and Research, The University of Sydney, Sydney, NSW, Australia

Frances Boyle

Northern Medical School, The University of Sydney, Sydney, NSW, Australia

Frances Boyle & Melanie Lovell

Macau University of Science and Technology, Macau, China

Lawrence Lam

Deakin Health Economics, Institute for Health Transformation, School of Health and Social Development, Deakin University, Melbourne, Australia

Nikki McCaffrey

Contributions

TL, JP, MA, PMD, TS, DCC, FB, LL, NM and ML contributed to the concept and design of this research. TL, LR, MR, MG and ML contributed to the acquisition, analysis or interpretation of the data. TL and ML contributed to drafting of the manuscript. All authors contributed to revisions of the manuscript and approved the final version.

Corresponding author

Correspondence to Tim Luckett .

Ethics declarations

Ethics approval and consent to participate

This research was approved by the Southwestern Sydney Local Health District Human Research Ethics Committee (HREC/14/LPOOL/479). All participants gave written informed consent to participate.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .

About this article

Cite this article

Luckett, T., Phillips, J., Agar, M. et al. Factors influencing fidelity to guideline implementation strategies for improving pain care at cancer centres: a qualitative sub-study of the Stop Cancer PAIN Trial. BMC Health Serv Res 24 , 969 (2024). https://doi.org/10.1186/s12913-024-11243-1

Received : 10 April 2024

Accepted : 25 June 2024

Published : 22 August 2024

DOI : https://doi.org/10.1186/s12913-024-11243-1


  • Implementation
  • Qualitative

  • Open access
  • Published: 19 August 2024

Updating a conceptual model of effective symptom management in palliative care to include patient and carer perspective: a qualitative study

  • Emma J. Chapman 1 ,
  • Carole A. Paley 1 ,
  • Simon Pini 2 &
  • Lucy E. Ziegler 1  

BMC Palliative Care volume 23, Article number: 208 (2024)


A conceptual model of effective symptom management was previously developed from interviews with multidisciplinary healthcare professionals (HCP) working in English hospices. Here we aimed to answer the question: does an HCP data-derived model represent the experience of patients and carers of people with advanced cancer?

Semi-structured interviews were undertaken with six patients with advanced cancer and six carers to gain an in-depth understanding of their experience of symptom management. Analysis was based on the framework method: transcription, familiarisation, coding, applying the analytical framework (conceptual model), charting and interpretation. Deductive framework analysis was used to align data with themes in the existing model, while an inductive approach was also used to identify new themes.

The experience of patients and carers aligned with key steps of engagement, decision making, partnership and delivery in the HCP-based model. The data aligned with 18 of 23 themes. These were: Role definition and boundaries; Multidisciplinary team decision making; Availability of services/staff; Clinician-patient relationship/rapport; Patient preferences; Patient characteristics; Quality of life versus treatment need; Staff time/burden; Psychological support (informal); Appropriate understanding, expectations, acceptance and goals (patients); Appropriate understanding, expectations, acceptance and goals (HCPs); Appropriate understanding, expectations, acceptance and goals (family, friends and carers); Professional, service and referral factors; Continuity of care; Multidisciplinary team working; Palliative care philosophy and culture; Physical environment and facilities; and Referral process and delays. Four additional patient and carer-derived themes were identified: Carer burden, Communication, Medicines management and COVID-19. Constructs that did not align were Experience (of staff), Training (of staff), Guidelines and evidence, Psychological support (for staff) and Formal psychological support (for patients).

Conclusions

A healthcare professional-based conceptual model of effective symptom management aligned well with the experience of patients with advanced cancer and their carers. Additional domains were identified. We make four recommendations for change arising from this research: routine appraisal and acknowledgement of carer burden, medicines management tasks and previous experience in healthcare roles; improved access to communication skills training for staff; and review of patient communication needs. Further research should explore the symptom management experience of those living alone and how these people can be better supported.

A conceptual model of effective symptom management was previously developed from qualitative data derived from interviews with healthcare professionals working in English hospices to elicit their views about the barriers and facilitators of effective symptom management [ 1 ]. The model delineated the successful symptom management experience into four steps: engagement, decision-making, partnership and delivery. Constructs contributing to these were identified (Table 1).

Our original model was based solely on healthcare professional (HCP) input. However, the perceptions of professionals may differ from those of patients and carers. A recent patient and professional survey of needs assessments in an oncology inpatient unit showed discrepancies between staff and patient perceptions of unmet needs [ 2 ]. For this reason, we were concerned that what was deemed important by HCPs working in palliative care may not mirror the concerns and experience of patients and carers.

Here we aimed to answer the question: does an HCP data-derived model represent the experience of patients and carers of people with advanced cancer? If necessary, the original conceptual model of effective symptom management would be updated.

Qualitative, semi-structured interviews were chosen to gain an in-depth understanding of the experience from the perspective of a range of patients and carers. All methods were carried out in accordance with the principles of the Declaration of Helsinki. Ethical approval was granted by a UK research ethics committee (North of Scotland (2) Research Ethics Committee (20/NS/0086)). Verbal, recorded informed consent was given using a verbal consent script (Supplementary information 1). Our original intention had been to conduct interviews face to face, facilitated by a set of laminated prompt cards based upon those used in the HCP interviews. However, adaptation to telephone interviews in patients’ homes was necessary due to COVID-19 restrictions, and it became apparent that the card exercise did not work well remotely. We continued interviews based on the interview schedule but without the use of prompt cards. EC is a female, non-clinical senior research fellow in palliative care. She has experience of qualitative interviews and led the development of the original HCP-based model of effective symptom management [ 1 ]. Audio recordings were transcribed verbatim by a senior academic secretary.

Recruitment

Participants who met the inclusion criteria were identified by a research nurse at the participating hospice. Eligible patients were those who met all 5 criteria:

Diagnosed with advanced disease (i.e., cancer that is considered to be incurable).

Had been referred to the participating hospice.

Were 18 years of age or over.

Were able to speak and understand English.

Were able to give informed consent.

Eligible carers were people who met all 4 criteria:

Were the informal carer of an eligible patient (who may or may not also be participating in the study).

Patients or carers were excluded if they:

Exhibited cognitive dysfunction which would impede their being able to give informed consent and take part in the study.

Were deemed by hospice staff to be too ill or distressed.

Access to the inpatient unit was not possible at this time due to COVID-19 restrictions. The research nurse introduced the study, provided a participant information sheet and completed a consent-to-contact form. The first contact with the researcher was made by telephone to confirm (or not) interest in participation and answer questions. An interview time, not less than 48 h after provision of the participant information sheet, was scheduled. The researcher and the participant information sheet explained the overall aim of the RESOLVE research programme to improve health status and symptom experience for people living with advanced cancer (Supplementary information 2). The verbal consent statements made it clear that this was a conversation for research purposes only and would not have any impact on the care the patient received (Supplementary information 3). Permission was granted for the researcher to contact the clinical team at the hospice if there was a serious concern for welfare that required urgent attention. Verbal informed consent was collected and audio recorded at the start of the interview, with participants answering yes or no to each of the statements in the verbal consent script (Supplementary information 3). Participants were told that we had already interviewed HCPs about what helped or hindered effective symptom management and now we wanted to understand their perspective too.

Data Collection

Interview topic guides (Supplementary information 4 and 5) were used. Interviews were conducted by EC over the telephone and audio recorded onto an encrypted Dictaphone. Files were downloaded onto a secure University of Leeds drive and then deleted from the Dictaphone. No video was recorded. The researcher made brief field notes directly after the interview on impression, emotion and participant backgrounds that were disclosed.

An Excel spreadsheet was used to facilitate data management. We explored the constructs of patient and carer experience as defined by our existing model. A deductive framework analysis was used to align data with themes in the existing conceptual model, while an inductive approach was also used to identify new themes not included in the original model. Two researchers (EC and CP) independently conducted framework analysis on all transcripts. Data were then compared and discussed until a consensus data set was developed. The study is reported in accordance with Standards for Reporting Qualitative Research (SRQR) recommendations [ 11 ].
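To make the charting step of the framework method concrete, the minimal sketch below shows how coded excerpts might be organised into a framework matrix of themes by participants. This is an illustration only, not part of the study workflow (which used an Excel spreadsheet), and the participants, themes and summaries in it are hypothetical examples rather than study data.

```python
# Illustrative sketch only: organising coded excerpts into a framework matrix
# (theme x participant), i.e. the "charting" step of the framework method.
from collections import defaultdict

# Hypothetical coded excerpts: (participant, theme, summarised data point)
coded_excerpts = [
    ("Carer A", "Communication", "Ease of getting in touch with the team valued"),
    ("Carer B", "Carer burden", "Felt unacknowledged by hospital staff"),
    ("Carer C", "Medicines management", "Repeated phone calls needed to clarify doses"),
    ("Carer A", "Carer burden", "Initiated the referral to the hospice"),
]

# Chart: theme -> participant -> list of summarised data points
framework_matrix = defaultdict(lambda: defaultdict(list))
for participant, theme, summary in coded_excerpts:
    framework_matrix[theme][participant].append(summary)

# Print one block per theme, roughly as it might appear in a charting spreadsheet
for theme, by_participant in framework_matrix.items():
    print(f"Theme: {theme}")
    for participant, summaries in by_participant.items():
        print(f"  {participant}: {'; '.join(summaries)}")
```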

Twelve participants were interviewed in their own homes by telephone. In five interviews a family member or friend was also present, and they were interviewed as a dyad. One interview was with a carer of a patient (patient not interviewed) and one interview was with a patient alone. Interviews lasted between 21 and 45 min. Basic self-declared demographic information was collected (Table 2 ).

One person was approached by a research nurse and provided with a participant information sheet. However, when they spoke with the researcher on the telephone it was clear that they had not read the participant information sheet. The individual declined to have the information read aloud to them. Informed consent could therefore not be given and an interview was not carried out. Upon reflection, this person was keen to chat informally to the researcher but was perhaps seeking social interaction rather than research participation. All other participants completed the interview as planned.

Participant background was relevant, as one carer and one patient had experience of working in healthcare and this may have shaped their experience and understanding. Analysis was based on the framework method: transcription, familiarisation, coding, applying the analytical framework (conceptual model), charting and interpretation.

Data aligned with 18 of 23 constructs in the professional-based model (Table 3). Pseudonyms are used to protect confidentiality.

Four constructs that had featured in the healthcare professional-based model did not feature in the patient and carer-derived data. These were, perhaps not unexpectedly, related to characteristics of staff: Experience (of staff), Training (of staff), Psychological support (for staff) and the provision of formal psychological support (for patients). One construct, ‘Guidelines and Evidence’, was not explicitly mentioned by patients and carers. However, a carer did comment that at the time of referral to the hospice, the patient had been on two different doses of co-codamol simultaneously: ‘You were on co-codamol, the 500/8 plus co-codamol 500/30’ (Patricia, carer), which suggested to the researchers that the patient had been taking the medication in a way contrary to guidelines. Medications were then optimised by hospice staff. Four additional patient and carer-derived themes were identified: Carer burden, Communication, Medicines management and Impact of COVID-19 (Fig. 1).

Fig. 1 The conceptual model of effective symptom management in palliative care was updated to also reflect the patient and carer perspective. Specifically, the need for support with communication and medicines management, plus consideration of the carer burden, were included.

Carer burden

Our HCP-based conceptual model identified a role for the carer in shaping symptom management experience in either a positive or negative way [ 1 ]. The patient and carer-derived data presented here provide additional insight into their role and the activities required of them. Carer burden is a multifaceted experience; however, our interview schedule specifically asked about symptom management experience.

The carer was sometimes responsible for raising concerns and initiating the referral for specialist palliative care support: ‘it was at some stage earlier in this year when I was a little anxious about your health and contacted the chemo wing at (hospital) and one of the nurses there thought it would be helpful to me and Patient to put us in touch with (the hospice)’ (Kathleen, carer).

Carers were enmeshed into the disease and symptom experience of the patient, referring to ‘we’ when talking about the patient’s cancer treatment, pain and referral to hospice.

Olivia (carer): Immune therapy we’d had a reaction to and we’d resolved the reaction but it concluded in stopping any treatment and we then went to a situation where we were not able to manage the pain from the cancer successfully and it was recommended by our oncologist that (the hospice) may have some expertise that we could….
Olivia (carer): Tap into…as I say that was a difficult decision for us to agree for Anthony to go into (the hospice).

However, on occasion the insight from the carer was not acted upon, leading to a delay in support for distressing symptoms: ‘I kept saying to people, he’s losing weight, he’s in pain and they just kept saying well he shouldn’t be in this amount of pain ‘cos of what his bloods are like. And I kept saying well what you’re saying he should be like, I can tell you he’s not like and we’re not ones to you know erm (he) isn’t one to be bothering the doctor.’ (Sandra, carer).

Once the patient was receiving palliative care, the carer took responsibility for obtaining and retaining knowledge, either because the patient could not (due to memory problems from medication or their condition) or because they were not willing to do this for themselves.

Martin (patient): ‘she knows better than me ‘cos I’m always, I’m not very good at remembering stuff’
Martin (patient): ‘I’m not interested no I understand you do have a very important role and she’s taken the lead on it now, that’s definitely the case’

And with another couple

Terry (patient): Sorry I’ve got my wife at the side of me ‘cos she knows better than me ‘cos I’m always, I’m not very good at remembering stuff.
Stacey (carer): I’m usually present yeah, I’m usually around. I tend to be the one that asks more questions.

However, in our interviews discordance between patient and carer opinion was occasionally seen, with the carer rating the symptoms as more troublesome than the patient recalled.

Interviewer: So was it (the pain) stopping you doing any activities that you had been able to do?
Martin (patient): Oh I see, not particularly no
Mary (carer): I would probably disagree with that sorry. I would say that Martin’s management of the pain and our management of the pain and everything was kind of a constant thing, that’s all we, you know it felt like we were talking about it all the time, his pain’.

Despite an integral role in facilitating effective symptom management, carers could feel unacknowledged, specifically by hospital staff: ‘at the same time they’re telling me I’m not a carer and yet you know Wendy would be in a very sorry state if I wasn’t on the ball all the time’ (Patricia, carer). Specialist palliative care staff were better at providing acknowledgement and consideration of individual capabilities.

Patricia (carer): ‘So they understand that I’m not sort of hale and hearty and I’ve got my limitations….and it’s just lovely them knowing and actually accepting that I am caring for patient, we are doing the best that we can and that they are there for us.’ This simple step of acknowledgement was appreciated and was a factor in allowing the carer to continue to support the patient.
Olivia (carer): ‘You know I do feel that it’s about me as well, it’s not just about Anthony which, it is really all about Anthony but you know it’s important that I continue with my wellbeing in order that I can support and look after him’ .

Communication

The impact of communication on effective symptom management occurred at different levels. As would be expected, communication needed to be tailored to the background, previous experience and outlook of the individual. In particular, we noted that a patient who had a healthcare background themselves welcomed more in-depth discussion and input into decision making.

Andrew (patient): I’ve dealt with people with cancers and terminal illnesses. Yeah, I know about syringe drives and everything…The important thing is to be able to discuss it and with my knowledge of medication as well, I mean I can discuss it in depth.’ .

Interestingly, this person also equated being admitted to the hospice with the use of a syringe driver and end of life, illustrating that regardless of the patient’s professional background, a thorough explanation without any assumptions on understanding would still be necessary. Andrew (patient):  ‘I mean I could go into (the hospice) at any time knowing this but with my work record and everything else, I know what it all entails I mean I’d probably go in and they’d probably want to put me on a syringe drive with Oramorph and Midazolam and Betamethasone and everything else and I know that is the beginning of the end once you start on the syringe driver and everything because it just puts you to sleep and just makes you comfortable and you don’t really have no quality of life’ .

Patients and carers valued being able to get in contact with someone when difficulties arose. Kathleen (carer): ‘Ease of communication is important to us so it’s easy to get in touch with somebody’ .

For some people, at the earlier stages after referral to the palliative care team, the only support that they required was just telephone contact.

Kathleen (carer): ‘What we have at the moment is a phone number to call and another lady, a nurse who actually rings us probably about once a fortnight yeah to check if we have any anxieties, problems.’ .

Palliative care professionals had a key role in mediating communication between patients and carers and other services. Kathleen (carer):  ‘she said yes, do you think Harry would mind us contacting the GP you know and I said I’m sure he would, if I think it’s a good idea he’d go along with it so that’s what we did, she did, she contacted our GP which meant that we got a telephone appointment and something happened very quickly’ .

This extended to explaining the purpose and results of tests such as X-rays.

Stacey (carer): Yeah he went when he was admitted he went for an Xray and that was the hospice, it was (clinical nurse specialist) that had organised that. We didn’t really know what was happening in the hospital but we came home again and he didn’t really know why he’d had the Xray or anything.
So when he spoke to the nurse at (the hospice), she sort of went through it all with him and talked him through it and that was really informative and helpful

There was a feeling that communication was better in specialist palliative care compared to the general National Health Service (NHS).

Olivia (carer): ‘There is an awful lot to be learned from the NHS about liaising and communications they could learn an awful lot from the way that the palliative care is operating and running’.

The carer also became an advocate for the patient's needs, relaying information about symptoms and concerns to healthcare professionals that the patient might not have relayed themselves. Andrew (patient): 'I mean she (partner) tells (hospice nurse) things that I don't 'cos I mean I sometimes bottle quite a few things up and don't say nothing but (partner) notices these things and then she will tell (hospice nurse) about them'.

This was also seen during a research interview, where the patient was willing for the carer to ‘tell the story’ on their behalf.

Mary (carer): Sorry I’m doing all the talking.
Martin (patient): Well no you need to because I’m useless.

We identified that patients had unmet needs in communicating about their condition and symptom experience with family and friends other than their regular carer: 'Yeah, erm, again it's, people are very reticent to use the word cancer. So they balk at saying the word' (Wendy, patient).

Wendy (patient): I don’t know where she’s (my sister) at in terms of knowing about my symptoms and about the treatment I’m having, well no I do tell her actually, it’s not that I don’t but she has very bad arthritis…so I don’t push that too much because I’m thinking she’s actually in as much pain as I might be.’

This lack of communication could come from a position of wishing to protect the feelings of family members:

Wendy (patient): 'Oh it's been very difficult with family. You don't know how much you want to tell them and you don't know how far down the line you are anyway. I think over the years, I've been protecting my family.'

Sometimes there were other important conversations that had not been held with family members.

Martin (patient): ‘I suppose my point in bringing up was because they’re particularly good kids and they are particularly, although I wouldn’t like them to hear me say it but they are, very good’ .

The work of medicines management

Medicines management was a time-consuming and complex task, even for carers who had a background working in healthcare.

Sandra (carer): ‘I’m having to ring back my fourth phone call today to see is it a week off or have they forgotten to give him it. The communication isn’t great and I kind of think you know I’m kind of used to the NHS I’m, I know to ring and that sort of thing but I do think, I think if someone isn’t, got a health background or that sort of background there’s a lot of left to guesswork’ .

Commonly, the responsibility of managing the medicines could be delegated to the carer due to the side effects of the medication on the patient’s memory. It was felt that the patient would not have been able to manage by themselves. Mary (carer): ‘ a lot of the medication has made him not so aware, maybe a little bit muddled at times and his memory’s not as good as it was….you know he does forget quite easily so I wouldn’t, I have to say I wouldn’t trust him with his medication at all.’.

Carers took responsibility for ensuring medications were taken on time. As previously reported, this carer viewed this as a joint endeavour with the patient.

Patricia (carer): I wake (patient) at 9 o’clock and make sure that she has her Lansoprazole and that she has her 12 hourly Longtech tablet. I generally am doing everything and as I say, we put the injection in at lunchtime every day and at night I remind her, not that she doesn’t, she doesn’t really need reminding but at 9 o’clock, I say have you had your tablets?’ .

The carer (who did not have a healthcare background) had developed an understanding of complex concepts such as the different modes of metabolism of medication for pain.

Patricia (carer): ‘So she’s now on a different set of pain relief which, the morphine was better but not better for her. So the pain killing stuff that she’s on is processed through the liver rather than through the kidneys and the kidney function has stabilised.’ .

Impact of COVID-19

Interviewees were asked whether COVID-19 had impacted upon their experience. It seemed that for this selected group of patients and carers the impact was minimal.

Patricia (carer): ‘Can I just add that Covid seems to have, people have been complaining that this has stopped and that’s stopped whereas with Wendy her appointments, they’ve always wanted face to face and we’ve done phone appointments when it’s been appropriate and the care has been absolutely marvelous’.

Availability of hospice staff sometimes filled gaps in other services.

Kathleen (carer): ‘Because of lockdown and the virus and everything obviously all that (GP support) changed and you did start to feel a bit isolated and alone ‘cos you don’t always want to have to get in the car and drive to (hospital) for something if it’s not absolutely necessary and so therefore having someone else to talk to who knew more about things because obviously we’re learning as we go along Harry and I, it was very helpful’.

Problems were attributed to the general NHS system rather than being COVID-19 specific.

Sandra (carer): ‘I think as far as forthcoming information, I don’t think Covid has any bearing on that to be honest. You know, it just, I think it’s just an age-old problem in the NHS is communication.’ .

The close alignment of this patient and carer data with our HCP-based conceptual model reinforces the importance of multidisciplinary working and continuity of care in shaping the symptom management experience. Indeed, the ability to see a preferred member of general practice staff was recently reported as a factor associated with satisfaction with end-of-life care in England [3].

Palliative care takes a holistic view of the patient and carer, whose concerns are intertwined and interdependent. The observation that carers and patients viewed themselves as a single unit and talked about 'we' when describing the experience of symptoms and service referral aligns with the dimension of the carer 'living in the patient's world' and living in 'symbiosis' recently described by Borelli et al. [4] and in earlier qualitative work with advanced cancer patients [5]. Carer opinion can be a close but not always perfect proxy for the patient's voice; even in this small sample we observed some discordance between patient and carer perception of symptom burden. However, carers were vitally important for communication with healthcare providers, relaying concerns, managing medication and generally advocating for the patient when they were unable or unwilling to do so. In the UK in 2022, 8.3 million people were living alone, and since 2020 the number of people over 65 years old living alone has increased [6]. Household composition is not a general indicator of wider social support networks, but these data suggest that a considerable number of people with palliative care needs may lack live-in carer support. This raises the questions of whether the experience of those living without a supportive carer can be equitable and how services might better support them.

Home-based palliative care is thought to reduce symptom burden for patients with cancer [7]. To enable this, it is vital that carers are adequately supported. Carer burden is a multifaceted experience; however, our interview schedule asked specifically about symptom management experience. In agreement with the term 'role strain' used in the review by Choi and Seo [8], we saw carers' involvement in symptom management and in mediating communication between the patient and healthcare providers. Additional aspects reported by Choi and Seo include physical symptoms of the carer, psychological distress, impaired social relationships, spiritual distress, financial crisis, disruption of daily life and uncertainty [8]; these will not all have been probed by our interview topic guide.

Although HCPs in our original study talked about medicines from their perspective, the role of the carer was not discussed. Medicines management was an important way in which carers facilitated effective symptom management, but it is a complex task. One carer commented: 'I have to say that would be a nightmare if I wasn't a nurse by background'. Our data on the difficulties of medicines management are not novel and closely mirror the report of Pollock et al. [9]. Our findings echo and support their conclusions that managing medicines at home during end-of-life care could be improved by reducing the work of medicines management and improving coordination and communication in healthcare, and we echo their calls for further research in this area.

We identified that patients and carers viewed mediating communication as an important role for healthcare professionals. This could involve enabling communication between patients or carers and other healthcare professionals, for example arranging follow-up care or explaining information received. There was also a need for better communication between patients and their family members. As reviewed and synthesised by Murray et al., the importance of effective communication in palliative care has long been recognised [10]. In our study, we identified an opportunity for HCPs to facilitate better communication about symptom experience between patients and their wider family. Our previous survey of English hospices found that healthcare professionals, particularly nurses and allied health professionals, felt that they needed more training in basic and advanced communication skills [11]. With relevant experience and appropriate training, staff may be well placed to support patients in developing an approach to these potentially difficult conversations. Participants were offered a choice of joint or individual interviews, but most chose to be interviewed as a dyad. It is possible that being interviewed as a pair altered the information disclosed. Although the aim was to discuss factors that influenced effective symptom management, discussions at times deviated to a more general appraisal of a participant's experiences, and not all data collected may be relevant to the research question.

When the data that led to the development of the HCP-based model of effective symptom management were collected (May to November 2019), a global pandemic was unforeseen. At the time of the patient and carer interviews described here (October to December 2020), COVID-19 restrictions were in place in the UK. The patients and carers we interviewed were already receiving specialist palliative care support as outpatients. For these individuals, the COVID-19 pandemic appeared to have had minimal impact on their care. The availability and reassurance of telephone support from hospice staff seemed in part to ameliorate the reduced support available from other services such as GPs. This contrasts sharply with the negative impact of COVID-19 on the experience of patients and carers in the more immediate end-of-life phase [12], receiving oncology care [13] or with cancer more generally [14]. Selection bias is likely, as patients and carers with the capacity and willingness to participate in our research study probably reflect those whose illness is in a more stable phase and whose immediate needs were being met. Indeed, participants talked about difficulties before referral to specialist palliative care and with other services but were overwhelmingly positive about the support currently being provided by the hospice.

Limitations

Due to the constraints of conducting a research study during the COVID-19 lockdown, more purposive sampling was not possible, which led to a lack of diversity in our sample. All participants identified themselves as of white British or white Scottish ethnicity, which potentially means issues related to diverse ethnicities were not captured. All the patients who participated (and the non-participating patient whose carer was interviewed) lived with another person and had carer or family support. The experience of those managing their symptoms in isolation was therefore not captured. All participants were currently accessing support from a single hospice; the experience of those not yet receiving specialist support, or receiving support from a different organisation, may differ. The sample was diverse in age and included males and females, but all carers were female. Demographic information was not collected on socioeconomic background. COVID-19 restrictions necessitated the use of telephone interviews, which may have lost subtle communication cues such as body language or, conversely, may have facilitated candid description. The transcripts suggest that participants felt comfortable telling their experience, and they mostly spoke freely with limited prompting. One participant mentioned that he found it very difficult to leave the house, so a telephone interview might have facilitated his inclusion. In some interviews more data were derived from the opinion of the carer than the patient, with the pair agreeing that the carer took responsibility for many of the tasks involved in managing the condition. We cannot be certain that carer interpretation accurately matches patient experience for all symptoms [15].

We set out to answer the question: does a model derived from healthcare professional data represent the experience of patients with advanced cancer and their carers? Overall, the answer was yes, as our healthcare professional-based conceptual model of effective symptom management aligned well with the experience of patients with advanced cancer and their carers. Domains that did not align were those specifically related to professionals: experience (of staff), training (of staff), guidelines and evidence, psychological support (for staff) and the provision of formal psychological support (for patients), a resource patients and carers might be unaware of. Additional domains of carer burden, communication, medicines management and the impact of COVID-19 were identified. We make four recommendations arising from this research.

  • Routine appraisal and acknowledgement of carer burden, medicines management tasks and previous experience in healthcare roles.
  • Increased access to communication skills training for staff caring for palliative care patients and their families.
  • Review of patient communication needs, with support provided where needed.
  • Further research into the symptom management experience of those living alone and exploration of how these people can be better supported.

Availability of data and materials

Original recordings generated and analysed during the current study are not publicly available due to protection of confidentiality. Anonymised transcripts with identifiable information removed may be available from the corresponding author on reasonable request.

Abbreviations

COVID-19: Coronavirus disease 2019
HCP: Healthcare professional
NHS: National Health Service
UK: United Kingdom

References

Chapman EJ, Pini S, Edwards Z, Elmokhallalati Y, Murtagh FEM, Bennett MI. Conceptualising effective symptom management in palliative care: a novel model derived from qualitative data. BMC Palliat Care. 2022;21(1):17.


Cosgrove D, Connolly M, Monnery D. Holistic care needs in an inpatient oncology unit: patients versus professionals. BMJ Support Palliat Care. 2023. Published Online First: https://doi.org/10.1136/spcare-2023-004617 .

ElMokhallalati Y, Chapman E, Relton SD, Bennett MI, Ziegler L. Characteristics of good home-based end-of-life care: analysis of 5-year data from a nationwide mortality follow-back survey in England. Br J Gen Pract. 2023;73(731):e443–50.

Borelli E, Bigi S, Potenza L, Gilioli F, Efficace F, Porro CA, et al. Caregiver’s quality of life in advanced cancer: validation of the construct in a real-life setting of early palliative care. Front Oncol. 2023;13:1213906.

McDonald J, Swami N, Pope A, Hales S, Nissim R, Rodin G, et al. Caregiver quality of life in advanced cancer: Qualitative results from a trial of early palliative care. Palliat Med. 2018;32(1):69–78.


Office for National Statistics (ONS). Statistical bulletin: Families and households in the UK: 2022. 2023 [updated 18 May 2023]. Available from: https://www.ons.gov.uk/peoplepopulationandcommunity/birthsdeathsandmarriages/families/bulletins/familiesandhouseholds/2022 .

Gomes B, Calanzani N, Curiale V, McCrone P, Higginson IJ. Effectiveness and cost-effectiveness of home palliative care services for adults with advanced illness and their caregivers. Cochrane Database Syst Rev. 2013;2013(6):CD007760. https://doi.org/10.1002/14651858.CD007760.pub2 .

Choi S, Seo J. Analysis of caregiver burden in palliative care: an integrated review. Nurs Forum. 2019;54(2):280–90.

Pollock K, Wilson E, Caswell G, Latif A, Caswell A, Avery A, et al. Family and health-care professionals managing medicines for patients with serious and terminal illness at home: a qualitative study. NIHR J Libr. 2021.

Atkin H, McDonald C, Murray CD. The communication experiences of patients with palliative care needs: a systematic review and meta-synthesis of qualitative findings. Palliat Support Care. 2015;13(2):369–83.

Paley CA, Keshwala V, Farfan Arango M, Hodgson E, Chapman EJ, Birtwistle J. Evaluating provision of psychological assessment and support in palliative care: A national survey of hospices in England. Progress in Palliative Care. 2024;32(1):11–21.


Bailey C, Guo P, MacArtney J, Finucane A, Meade R, Swan S, Wagstaff E. “Palliative care is so much more than that”: a qualitative study exploring experiences of hospice staff and bereaved carers during the COVID-19 pandemic. Front Public Health. 2023;11:1139313.

de Joode K, Dumoulin DW, Engelen V, Bloemendal HJ, Verheij M, van Laarhoven HWM, et al. Impact of the coronavirus disease 2019 pandemic on cancer treatment: the patients’ perspective. Eur J Cancer. 2020;136:132–9.

Moraliyage H, De Silva D, Ranasinghe W, Adikari A, Alahakoon D, Prasad R, et al. Cancer in lockdown: impact of the COVID-19 pandemic on patients with cancer. Oncologist. 2021;26(2):e342–4.


McPherson CJ, Addington-Hall JM. Judging the quality of care at the end of life: can proxies provide reliable information? Soc Sci Med. 2003;56(1):95–109.


Acknowledgements

We are grateful to the patients and carers who, in giving valuable time to share their experiences, made this research possible. We thank research nurses Kath Black and Angela Wray for their support with recruitment.

Funding

The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: this work was supported by Yorkshire Cancer Research programme grant L412, RESOLVE: "Improving health status and symptom experience for people living with advanced cancer". The sponsor had no role in study design; in the collection, analysis and interpretation of data; in the writing of the report; or in the decision to submit the article for publication.

Author information

Authors and Affiliations

Academic Unit of Palliative Care, Worsley Building, University of Leeds, Clarendon Way, LS2 9NL, UK

Emma J. Chapman, Carole A. Paley & Lucy E. Ziegler

Division of Psychological and Social Medicine, Worsley Building, University of Leeds, Clarendon Way, LS2 9NL, UK


Contributions

Original idea: EC and SP; data collection: EC; data analysis: EC and CP; data interpretation: all; methodological oversight: SP and LZ; writing the manuscript: all. All authors contributed to the development of the updated conceptual model and approved the final submission.

Corresponding author

Correspondence to Emma J. Chapman .

Ethics declarations

Ethics approval and consent to participate

Ethical approval was granted by a UK research ethics committee ( North of Scotland (2) Research Ethics Committee (20/NS/0086)). All participants gave informed consent for participation and for the use of their direct quotations in research publications.

Consent for publication

Participants gave consent for publication of their direct quotations.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1, Supplementary Material 2, Supplementary Material 3, Supplementary Material 4, Supplementary Material 5.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Chapman, E.J., Paley, C.A., Pini, S. et al. Updating a conceptual model of effective symptom management in palliative care to include patient and carer perspective: a qualitative study. BMC Palliat Care 23 , 208 (2024). https://doi.org/10.1186/s12904-024-01544-x


Received : 08 March 2024

Accepted : 08 August 2024

Published : 19 August 2024

DOI : https://doi.org/10.1186/s12904-024-01544-x


  • Symptom management
  • Conceptual model
  • Communication skills
  • Medicines management


Progress in Remote Sensing and GIS-Based FDI Research Based on Quantitative and Qualitative Analysis


1. Introduction

2. Research Methods and Data
2.1. Research Methods
2.2. Data Sources and Screening
2.3. Data Processing
3. Subject Categories and Publication Trends
3.1. Subject Evolution
3.2. Trends in the Number and Cited Times of Published Papers
4. The Intellectual Structure
4.1. Quantitative Analysis
4.2. Qualitative Analysis
4.2.1. Macro-Environmental Research at National, Regional, and City Scales
4.2.2. Global Industrial Development and Layout
4.2.3. Research on Global Value Chains
4.2.4. Micro-Information Geography of TNCs
4.2.5. Internationalization and Commercialization of Geo-Information Industry
4.2.6. Multiple Data and Interdisciplinary Approaches
5. Discussions and Conclusions
Data Availability Statement
Acknowledgments
Conflicts of Interest

Notes
1. … (accessed on 13 July 2024). One date of launch is missing from the data set, but this has a minimal impact on the overall trend.
2. … (accessed on 13 July 2024) is selected as the primary quantitative analysis tool in this paper.

Step | Description | Details
1 | Topic identification | Identify a knowledge domain using the broadest possible terms
2 | Data collection | Collect data from commonly used sources of scientific literature
3 | Terms extract | Extract research front terms
4 | Time slicing | Build time series models over time
5 | Outcome layout | Analyze domains and generate visualizations
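To make the workflow in this table more concrete, the sketch below shows one way the time-slicing and term-counting steps could be implemented in Python. It is a minimal illustration under stated assumptions, not the tooling used in the study: the Record structure, field names and demonstration keywords are invented for the example, and a dedicated bibliometric package would normally handle these steps.

```python
# Illustrative sketch of time slicing and term counting over bibliographic
# records; the Record fields and demo data are assumptions for the example.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Record:
    year: int
    keywords: list[str]  # author keywords or extracted noun phrases


def slice_by_period(records: list[Record], start: int, end: int, width: int):
    """Group records into fixed-width time slices (step 4 in the table)."""
    slices: dict[range, list[Record]] = {}
    for lo in range(start, end + 1, width):
        window = range(lo, min(lo + width, end + 1))
        slices[window] = [r for r in records if r.year in window]
    return slices


def top_terms(records: list[Record], n: int = 5) -> list[tuple[str, int]]:
    """Count keyword frequencies within one slice to approximate research fronts."""
    counts = Counter(k.lower() for r in records for k in r.keywords)
    return counts.most_common(n)


if __name__ == "__main__":
    demo = [
        Record(2005, ["remote sensing", "urban growth"]),
        Record(2018, ["FDI", "nighttime light"]),
        Record(2019, ["belt and road", "investment risk"]),
        Record(2021, ["GIS", "FDI", "land use"]),
    ]
    for window, recs in slice_by_period(demo, 2004, 2024, 7).items():
        print(f"{window.start}-{window.stop - 1}: {top_terms(recs)}")
```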
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Li, Z. Progress in Remote Sensing and GIS-Based FDI Research Based on Quantitative and Qualitative Analysis. Land 2024 , 13 , 1313. https://doi.org/10.3390/land13081313

Li Z. Progress in Remote Sensing and GIS-Based FDI Research Based on Quantitative and Qualitative Analysis. Land . 2024; 13(8):1313. https://doi.org/10.3390/land13081313

Li, Zifeng. 2024. "Progress in Remote Sensing and GIS-Based FDI Research Based on Quantitative and Qualitative Analysis" Land 13, no. 8: 1313. https://doi.org/10.3390/land13081313

Patient safety in remote primary care encounters: multimethod qualitative study combining Safety I and Safety II analysis (Volume 33, Issue 9)

  • Rebecca Payne 1 ,
  • Aileen Clarke 1 ,
  • Nadia Swann 1 ,
  • Jackie van Dael 1 ,
  • Natassia Brenman 1 ,
  • Rebecca Rosen 2 ,
  • Adam Mackridge 3 ,
  • Lucy Moore 1 ,
  • Asli Kalin 1 ,
  • Emma Ladds 1 ,
  • Nina Hemmings 2 ,
  • Sarah Rybczynska-Bunt 4 ,
  • Stuart Faulkner 1 ,
  • Isabel Hanson 1 ,
  • Sophie Spitters 5 ,
  • http://orcid.org/0000-0002-7758-8493 Sietse Wieringa 1 , 6 ,
  • Francesca H Dakin 1 ,
  • Sara E Shaw 1 ,
  • Joseph Wherton 1 ,
  • Richard Byng 4 ,
  • Laiba Husain 1 ,
  • http://orcid.org/0000-0003-2369-8088 Trisha Greenhalgh 1
  • 1 Nuffield Department of Primary Care Health Sciences , University of Oxford , Oxford , UK
  • 2 Nuffield Trust , London , UK
  • 3 Betsi Cadwaladr University Health Board , Bangor , UK
  • 4 Peninsula Schools of Medicine and Dentistry , University of Plymouth , Plymouth , UK
  • 5 Wolfson Institute of Population Health , Queen Mary University of London , London , UK
  • 6 Sustainable Health Unit , University of Oslo , Oslo , Norway
  • Correspondence to Professor Trisha Greenhalgh; trish.greenhalgh{at}phc.ox.ac.uk

Background Triage and clinical consultations increasingly occur remotely. We aimed to learn why safety incidents occur in remote encounters and how to prevent them.

Setting and sample UK primary care. 95 safety incidents (complaints, settled indemnity claims and reports) involving remote interactions. Separately, 12 general practices followed 2021–2023.

Methods Multimethod qualitative study. We explored causes of real safety incidents retrospectively (‘Safety I’ analysis). In a prospective longitudinal study, we used interviews and ethnographic observation to produce individual, organisational and system-level explanations for why safety and near-miss incidents (rarely) occurred and why they did not occur more often (‘Safety II’ analysis). Data were analysed thematically. An interpretive synthesis of why safety incidents occur, and why they do not occur more often, was refined following member checking with safety experts and lived experience experts.

Results Safety incidents were characterised by inappropriate modality, poor rapport building, inadequate information gathering, limited clinical assessment, inappropriate pathway (eg, wrong algorithm) and inadequate attention to social circumstances. These resulted in missed, inaccurate or delayed diagnoses, underestimation of severity or urgency, delayed referral, incorrect or delayed treatment, poor safety netting and inadequate follow-up. Patients with complex pre-existing conditions, cardiac or abdominal emergencies, vague or generalised symptoms, safeguarding issues, failure to respond to previous treatment or difficulty communicating seemed especially vulnerable. General practices were facing resource constraints, understaffing and high demand. Triage and care pathways were complex, hard to navigate and involved multiple staff. In this context, patient safety often depended on individual staff taking initiative, speaking up or personalising solutions.

Conclusion While safety incidents are extremely rare in remote primary care, deaths and serious harms have resulted. We offer suggestions for patient, staff and system-level mitigations.

  • Primary care
  • Diagnostic errors
  • Safety culture
  • Qualitative research
  • Prehospital care

Data availability statement

Data are available upon reasonable request. Details of real safety incidents are not available for patient confidentiality reasons. Requests for data on other aspects of the study from other researchers will be considered.

This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 Unported (CC BY 4.0) license, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and indication of whether changes were made. See:  https://creativecommons.org/licenses/by/4.0/ .

https://doi.org/10.1136/bmjqs-2023-016674


WHAT IS ALREADY KNOWN ON THIS TOPIC

Safety incidents are extremely rare in primary care but they do happen. Concerns have been raised about the safety of remote triage and remote consultations.

WHAT THIS STUDY ADDS

Rare safety incidents (involving death or serious harm) in remote encounters can be traced back to various clinical, communicative, technical and logistical causes. Telephone and video encounters in general practice are occurring in a high-risk (extremely busy and sometimes understaffed) context in which remote workflows may not be optimised. Front-line staff use creativity and judgement to help make care safer.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

As remote modalities become mainstreamed in primary care, staff should be trained in the upstream causes of safety incidents and how they can be mitigated. The subtle and creative ways in which front-line staff already contribute to safety culture should be recognised and supported.

Introduction

In early 2020, remote triage and remote consultations (together, ‘remote encounters’), in which the patient is in a different physical location from the clinician or support staff member, were rapidly expanded as a safety measure in many countries because they eliminated the risk of transmitting COVID-19. 1–4 But by mid-2021, remote encounters had begun to be depicted as potentially unsafe because they had come to be associated with stories of patient harm, including avoidable deaths and missed cancers. 5–8

Providing triage and clinical care remotely is sometimes depicted as a partial solution to the system pressures facing primary healthcare in many countries, 9–11 including rising levels of need or demand, the ongoing impact of the COVID-19 pandemic and workforce challenges (especially short-term or longer-term understaffing). In this context, remote encounters may be an important component of a mixed-modality health service when used appropriately alongside in-person contacts. 12 13 But this raises the question of what constitutes 'appropriate' and 'safe' use of remote modalities in a primary care context. Safety incidents (defined as 'any unintended or unexpected incident which could have, or did, lead to harm for one or more patients receiving healthcare 14 ') are extremely rare in primary healthcare consultations generally, 15 16 in-hours general practice telephone triage 17 and out-of-hours primary care. 18 But the recent widespread expansion of remote triage and remote consulting in primary care means that a wider range of patients and conditions are managed remotely, making it imperative to re-examine where the risks lie.

Theoretical approaches to safety in healthcare fall broadly into two traditions. 19 ‘Safety I’ studies focus on what went wrong. Incident reports are analysed to identify ‘root causes’ and ‘safety gaps’, and recommendations are made to reduce the chance that further similar incidents will happen in the future. 20 Such studies, undertaken in isolation, tend to lead to a tightening of rules, procedures and protocols. ‘Safety II’ studies focus on why, most of the time, things do not go wrong. Ethnography and other qualitative methods are employed to study how humans respond creatively to unique and unforeseen situations, thereby preventing safety incidents most of the time. 19 Such studies tend to show that actions which achieve safety are highly context specific, may entail judiciously breaking the rules and require human qualities such as courage, initiative and adaptability. 21 Few previous studies have combined both approaches.

In this study, we aimed to use Safety I methods to learn why safety incidents occur (although rarely) in remote primary care encounters and also apply Safety II methods to examine the kinds of creative actions taken by front-line staff that contribute to a safety culture and thereby prevent such incidents.

Study design and origins

Multimethod qualitative study across UK, including incident analysis, longitudinal ethnography and national stakeholder interviews.

The idea for this safety study began during a longitudinal ethnographic study of 12 general practices across England, Scotland and Wales as they introduced (and, in some cases, subsequently withdrew) various remote and digital modalities. Practices were selected for maximum diversity in geographical location, population served and digital maturity and followed from mid-2021 to end 2023 using staff and patient interviews and in-person ethnographic visits. The study protocol, 22 baseline findings 23 and a training needs analysis 24 have been published. To provide context for our ethnography, we interviewed a sample of national stakeholders in remote and digital primary care, including out-of-hours providers running telephone-led services, and held four online multistakeholder workshops, one of which was on the theme of safety, for policymakers, clinicians, patients and other parties. Early data from this detailed qualitative work revealed staff and patient concerns about the safety of remote encounters but no actual examples of harm.

To explore the safety theme further, we decided to take a dual approach. First, following Safety I methodology for the study of rare harms, 20 we set out to identify and analyse a sample of safety incidents involving remote encounters. These were sourced from arm’s-length bodies (NHS England, NHS Resolution, Healthcare Safety Investigation Branch) and providers of healthcare at scale (health boards, integrated care systems and telephone advice services), since our own small sample had not identified any of these rare occurrences. Second, we extended our longitudinal ethnographic design to more explicitly incorporate Safety II methodology, 19 allowing us to examine safety culture and safety practices in our 12 participating general practices, especially the adaptive work done by staff to avert potential safety incidents.

Data sources and management

Table 1 summarises the data sources.


Summary of data sources

The Safety I dataset (rows 2-5) consisted of 95 specific incident reports, including complaints submitted to the main arm’s-length NHS body in England, NHS England, between 2020 and 2023 (n=69), closed indemnity claims that had been submitted to a national indemnity body, NHS Resolution, between 2015 and 2023 (n=16), reports from an urgent care telephone service in Wales (NHS 111 Wales) between 2020 and 2023 (n=6) and a report on an investigation of telephone advice during the COVID-19 crisis between 2020 and 2022 7 (n=4). These 95 incidents were organised using Microsoft Excel spreadsheets.

The Safety II dataset (rows 6-10) consisted of extracts from fieldnotes, workshop transcripts and interviews collected over 2 years, stored and coded on NVivo qualitative software. These were identified by searching for text words and codes (e.g. ‘risk’, ‘safety’, ‘incident’) and by asking researchers-in-residence, who were closely familiar with practices, to highlight safety incidents involving harm and examples of safety-conscious work practices. This dataset included over 100 formal interviews and numerous on-the-job interviews with practice staff, plus interviews with a sample of 10 GP (general practitioner) trainers and 10 GP trainees (penultimate row of table 1 ) and with six clinical safety experts identified through purposive sampling from government, arm’s-length bodies and health boards (bottom row of table 1 ).
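As a rough illustration of this kind of text-word screening (not the NVivo workflow the team actually used), the Python sketch below filters exported excerpts for safety-related terms. The file name, column names and search terms here are assumptions for the example.

```python
# Illustrative only: a keyword filter over exported qualitative excerpts,
# approximating the text-word search described above. The CSV layout and
# column names are hypothetical.
import csv
from pathlib import Path

SAFETY_TERMS = {"risk", "safety", "incident", "near miss"}


def flag_safety_excerpts(csv_path: Path) -> list[dict]:
    """Return rows whose free-text 'excerpt' field mentions any safety term."""
    flagged = []
    with csv_path.open(newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            text = row.get("excerpt", "").lower()
            if any(term in text for term in SAFETY_TERMS):
                flagged.append(row)
    return flagged


# Example usage (hypothetical file exported from a qualitative coding tool):
# hits = flag_safety_excerpts(Path("fieldnote_extracts.csv"))
# for h in hits:
#     print(h["source"], "->", h["excerpt"][:80])
```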

Data analysis

We analysed incident reports, interview data and ethnographic fieldnotes using thematic analysis as described by Braun and Clarke. 25 These authors define a theme as an important, broad pattern in a set of qualitative data, which can (where necessary) be further refined using coding.

Themes in the incident dataset were identified by five steps. First, two researchers (both medically qualified) read each source repeatedly to gain familiarity. Second, those researchers worked independently using Braun and Clarke’s criterion (‘whether it captures something important in relation to the overall research question’—p 82 25 ) to identify themes. Third, they discussed their initial interpretations with each other and resolved differences through discussion. Fourth, they extracted evidence from the data sources to illustrate and refine each theme. Finally, they presented their list of themes along with illustrative examples to the wider team. Cases used to illustrate themes were systematically fictionalised by changing age, randomly allocating gender and altering clinical details. 26 For example, an acute appendicitis could be changed to acute diverticulitis if the issue was a missed acute abdomen.
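The sketch below is a minimal illustration of the bookkeeping implied by these steps: independent theme identification by two reviewers, comparison to flag disagreements, and collection of illustrative evidence. It is not part of the study's method; the theme names and excerpt text are invented for the example (drawing on themes reported in the abstract).

```python
# Minimal, illustrative bookkeeping for a multi-step thematic analysis.
# Theme names and excerpts are examples only; the study itself used manual
# thematic analysis, not this script.
from dataclasses import dataclass, field


@dataclass
class Theme:
    name: str
    description: str
    excerpts: list[str] = field(default_factory=list)  # illustrative evidence


def compare_reviewers(reviewer_a: set[str], reviewer_b: set[str]) -> dict[str, str]:
    """Mark themes the two reviewers agree on and those needing discussion."""
    status = {name: "agreed" for name in reviewer_a & reviewer_b}
    status.update({name: "discuss" for name in reviewer_a ^ reviewer_b})
    return status


if __name__ == "__main__":
    a = {"poor communication", "inappropriate modality", "delayed referral"}
    b = {"poor communication", "inadequate safety netting", "delayed referral"}
    print(compare_reviewers(a, b))

    theme = Theme("poor communication",
                  "failures of information exchange in remote encounters")
    theme.excerpts.append("call handler did not seek further clarification")
    print(theme.name, "->", theme.excerpts)
```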

These safety themes were then used to sensitise us to seek relevant (confirming and disconfirming) material from our ethnographic and interview datasets. For example, the theme 'poor communication' (and subthemes such as 'failure to seek further clarification' within this) prompted us to look for examples in our stakeholder interviews of poor communication offered as a cause of safety incidents and examples in our ethnographic notes of good communication (including someone seeking clarification). We used these wider data to add nuance to the initial list of themes.

As a final sense-checking step, the draft findings from this study were shown to each of the six safety experts in our sample and refined in the light of their comments (in some cases, for example, they considered the case to have been overfictionalised, thereby losing key clinical messages; they also gave additional examples to illustrate some of the themes we had identified, which underlined the importance of those themes).

Overview of dataset

The dataset ( table 1 ) consisted of 95 incident reports (see fictionalised examples in box 1 ), plus approximately 400 pages of extracts from interviews, ethnographic fieldnotes and workshop discussions, including situated safety practices (see examples in box 2 ), plus strategic insights relating to policy, organisation and planning of services. Notably, almost all incidents related to telephone calls.

Examples of safety incidents involving death or serious harm in remote encounters

All these cases have been systematically fictionalised as explained in the text.

Case 1 (death)

A woman in her 70s experiencing sudden breathlessness called her GP (general practitioner) surgery. The receptionist answered the phone and informed her that she would place her on the doctor’s list for an emergency call-back. The receptionist was distracted by a patient in the waiting room and did not do so. The patient deteriorated and died at home that afternoon.—NHS Resolution case, pre-2020

Case 2 (death)

An elderly woman contacted her GP after a telephone contact with the out-of-hours service, where constipation had been diagnosed. The GP prescribed laxatives without seeing the patient. The patient self-presented to the emergency department (ED) the following day in obstruction secondary to an incarcerated hernia and died in the operating theatre.—NHS Resolution case, pre-2020

Case 3 (risk to vulnerable patients)

A daughter complained that her elderly father was unable to access his GP surgery as he could not navigate the online triage system. When he phoned the surgery directly, he was directed back to the online system and told to get a relative to complete the form for him.—Complaint to NHS England, 2021

Case 4 (harm)

A woman in her first pregnancy at 28 weeks’ gestation experiencing urinary incontinence called NHS 111. She was taken down a ‘urinary problems’ algorithm. Both the call handler and the subsequent clinician failed to recognise that she had experienced premature rupture of membranes. She later presented to the maternity department in active labour, and the opportunity to give early steroids to the premature infant was missed.—NHS Resolution case, pre-2020

Case 5 (death)

A doctor called about a 16-year-old girl with lethargy, shaking, fever and poor oral intake who had been unwell for 5 days. The doctor spoke to her older sister and advised that the child had likely glandular fever and should rest. When the parents arrived home, they called an ambulance but the child died of sepsis in the ED.—NHS Resolution case, pre-2020

Case 6 (death)

A 40-year-old woman, 6 weeks after caesarean section, contacted her GP due to shortness of breath, increased heart rate and dry cough. She was advised to get a COVID test and to dial 111 if she developed a productive cough, fever or pain. The following day she collapsed and died at home. The postmortem revealed a large pulmonary embolus. On reviewing the case, her GP surgery felt that had she been seen face to face, her oxygen saturations would have been measured and may have led to suspicion of the diagnosis.—NHS Resolution case, 2020

Case 7 (death)

A son complained that his father with diabetes and chronic kidney disease did not receive any in-person appointments over a period of 1 year. His father went on to die following a leg amputation arising from a complication of his diabetes.—Complaint to NHS England, 2021

Case 8 (death)

A 73-year-old diabetic woman with throat pain and fatigue called the surgery. She was diagnosed with a viral illness and given self-care advice. Over the next few days, she developed worsening breathlessness and was advised to do a COVID test and was given a pulse oximeter. She was found dead at home 4 days later. Postmortem found a blocked coronary artery and a large amount of pulmonary oedema. The cause of death was myocardial infarction and heart failure.—NHS Resolution case, pre-2020

Case 9 (harm)

A patient with a history of successfully treated cervical cancer developed vaginal bleeding. A diagnosis of fibroids was made and the patient received routine care by telephone over the next few months until a scan revealed a local recurrence of the original cancer.—Complaint to NHS England, 2020

Case 10 (death)

A 65-year-old female smoker with chronic cough and breathlessness presented to her GP. She was diagnosed with chronic obstructive pulmonary disease (COPD) and monitored via telephone. She did not respond to inhalers or antibiotics but continued to receive telephone monitoring without further investigation. Her symptoms continued to worsen and she called an ambulance. In the ED, she was diagnosed with heart failure and died soon after.—Complaint to NHS England, 2021

Case 11 (harm)

A 30-year-old woman presented with intermittent episodes of severe dysuria over a period of 2 years. She was given repeated courses of antibiotics but no urine was sent for culture and she was not examined. After 4 months of symptoms, she saw a private GP and was diagnosed with genital herpes.—Complaint to NHS England, 2021

Case 12 (harm)

There were repeated telephone consultations about a baby whose parents were concerned that the child turned a funny colour when feeding or crying. The 6-week check was done by telephone and at no stage was the child seen in person. Photos were sent in, but the child’s dark skin colour meant that cyanosis was not easily apparent to the reviewing clinician. The child was subsequently admitted by emergency ambulance where a significant congenital cardiac abnormality was found.—Complaint to NHS England, 2020 1

Case 13 (harm)

A 35-year-old woman in her third trimester of pregnancy had a telephone appointment with her GP about a breast lump. She was informed that this was likely due to antenatal breast changes and was not offered an in-person appointment. She attended after delivery and was referred to a breast clinic where a cancer was diagnosed.—Complaint to NHS England, 2020

Case 14 (harm)

A 63-year-old woman with a variety of physical symptoms including diarrhoea, hip girdle pain, palpitations, light-headedness and insomnia called her surgery on multiple occasions. She was told her symptoms were likely due to anxiety, but was later diagnosed with stage 4 ovarian cancer and died soon after.—Complaint to NHS England, 2021

Case 15 (death)

A man with COPD with worsening shortness of breath called his GP surgery. The staff asked him if it was an emergency, and when the patient said no, scheduled him for 2 weeks later. The patient died before the appointment.—Complaint to NHS England, 2021

Examples of safety practices

Case 16 (safety incident averted by switching to video call for a sick child)

‘I’ve remembered one father that called up. Really didn’t seem to be too concerned. And was very much under-playing it and then when I did a video call, you know this child… had intercostal recession… looked really, really poorly. And it was quite scary actually that, you know, you’d had the conversation and if you’d just listened to what Dad was saying, actually, you probably wouldn’t be concerned.’—GP (general practitioner) interview 2022

Case 17 (‘red flag’ spotted by support staff member)

A receptionist was processing routine ‘administrative’ encounters sent in by patients using AccuRx (text messaging software). She became concerned about a sick note renewal request from a patient with a mental health condition. The free text included a reference to feeling suicidal, so the receptionist moved the request to the ‘red’ (urgent call-back) list. In interviews with staff, it became apparent that there had recently been heated discussion in the practice about whether support staff were adding ‘too many’ patients to the red list. After discussing cases, the doctors concluded that it should be them, not the support staff, who should absorb the risk in uncertain cases. The receptionist said that they had been told: ‘if in doubt, put it down as urgent and then the duty doctor can make a decision.’—Ethnographic fieldnotes from general practice 2023

Case 18 (‘check-in’ phone call added on busy day)

A duty doctor was working through a very busy Monday morning ‘urgent’ list. One patient had acute abdominal pain, which would normally have triggered an in-person appointment, but there were no slots and hard decisions were being made. This patient had had the pain already for a week, so the doctor judged that the general rule of in-person examination could probably be over-ridden. But instead of simply allocating to a call-back, the doctor asked a support staff member to phone the patient, ask ‘are you OK to wait until tomorrow?’ and offer basic safety-netting advice.—Ethnographic fieldnotes from general practice 2023

Case 19 (receptionist advocating on behalf of ‘angry’ walk-in patient)

A young Afghan man with limited English walked into a GP surgery on a very busy day, ignoring the prevailing policy of ‘total triage’ (make contact by phone or online in the first instance). He indicated that he wanted a same-day in-person appointment for a problem he perceived as urgent. A heated exchange occurred with the first receptionist, and the patient accused her of ‘racism’. A second receptionist of non-white ethnicity herself noted the man’s distress and suspected that there may indeed be an urgent problem. She asked the first receptionist to leave the scene, saying she wanted to ‘have a chat’ with the patient (‘the colour of my skin probably calmed him down more than anything’). Through talking to the patient and looking through his record, she ascertained that he had an acute infection that likely needed prompt attention. She tried to ‘bend the rules’ and persuade the duty doctor to see the patient, conveying the clinical information but deliberately omitting the altercation. But the first receptionist complained to the doctor (‘he called us racists’) and the doctor decided that the patient would not therefore be offered a same-day appointment. The second receptionist challenged the doctor (‘that’s not a reason to block him from getting care’). At this point, the patient cried and the second receptionist also became upset (‘this must be serious, you know’). On this occasion, despite her advocacy the patient was not given an immediate appointment.—Ethnographic fieldnotes from general practice 2022

Case 20 (long-term condition nurse visits ‘unengaged’ patients at home)

An advanced nurse practitioner talks of two older patients, each with a long-term condition, who are ‘unengaged’ and lacking a telephone. In this practice, all long-term condition reviews are routinely done by phone. She reflects that some people ‘choose not to have avenues of communication’ (ie, are deliberately not contactable), and that there may be reasons for this (‘maybe health anxiety or just old’). She has, on occasion, ‘turned up’ unannounced at the patient’s home and asked to come in and do the review, including bloods and other tests. She reflects that while most patients engage well with the service, ‘half my job is these patients who don’t engage very well.’—Ethnographic fieldnotes from digitally advanced general practice 2022

Case 21 (doctor over-riding patient’s request for telephone prescribing)

A GP trainee described a case of a 53-year-old first-generation immigrant from Pakistan, a known smoker with hypertension and diabetes. He had booked a telephone call for vomiting and sinus pain. There was no interpreter available but the man spoke some English. He said he had awoken in the night with pain in his sinuses and vomiting. All he wanted was painkillers for his sinuses. The story did not quite make sense, and the man ‘sounded unwell’. The GP told him he needed to come in and be examined. The patient initially resisted but was persuaded to come in. When the GP went to call him in, the man was visibly unwell and lying down in the waiting room. When seen in person, he admitted to shoulder pain. The GP sent him to accident and emergency (A&E) where a myocardial infarction was diagnosed.—Trainee interview 2023

Below, we describe the main themes that were evident in the safety incidents: a challenging organisational and system context, poor communication compounded by remote modalities, limited clinical information, patient and carer burden and inadequate training. Many safety incidents illustrated multiple themes—for example, poor communication combined with failures of clinical assessment or judgement, patient complexity and system pressures. In the detailed findings below, we illustrate both why safety incidents occasionally occur and why they are usually avoided.

The context for remote consultations: system and operational challenges

Introduction of remote triage and expansion of remote consultations in UK primary care occurred at a time of unprecedented system stress (an understaffed and chronically under-resourced primary care sector, attempting to cope with a pandemic). 23 Many organisations had insufficient telephone lines or call handlers, so patients struggled to access services (eg, half of all calls to the emergency COVID-19 telephone service in March 2020 were never answered 7). Most remote consultations were by telephone. 27

Our safety incident dataset included examples of technically complex access routes which patients found difficult or impossible to navigate (case 3 in box 1) and which required non-clinical staff to make clinical or clinically related judgements (cases 4 and 15). Our ethnographic dataset contained examples of inflexible application of triage rules (eg, no face-to-face consultation unless the patient had already had a telephone call), though in other practices these rules could be over-ridden by staff using their judgement or asking colleagues. Some practices had a high rate of failed telephone call-backs (patient unobtainable).

High demand, staff shortages and high turnover of clinical and support staff made the context for remote encounters inherently risky. Several incidents were linked to a busy staff member becoming distracted (case 1). Telephone consultations, which tend to be shorter, were sometimes used in the hope of improving efficiency. Some safety incidents suggested perfunctory and transactional telephone consultations, with flawed decisions made on the basis of incomplete information (eg, case 2).

Many practices had shifted—at least to some extent—from a demand-driven system (in which every request for an appointment was met) to a capacity-driven one (in which, if a set capacity was exceeded, patients were advised to seek care elsewhere), though the latter was often used flexibly rather than rigidly with an expectation that some patients would be ‘squeezed in’. In some practices, capacity limits had been introduced to respond to escalation of demand linked to overuse of triage templates (eg, to inquire about minor symptoms).

As a result of task redistribution and new staff roles, a single episode of care for one problem often involved multiple encounters or tasks distributed among clinical and non-clinical staff (often in different locations and sometimes also across in-hours and out-of-hours providers). Capacity constraints in onward services placed pressure on primary care to manage risk in the community, leading in some cases to failure to escalate care appropriately (case 6).

Some safety incidents were linked to organisational routines that had not adapted sufficiently to remote working—for example, a prescription might be issued but (for various reasons) it could not be transmitted electronically to the pharmacy. Certain urgent referrals were delayed if the consultation occurred remotely (a referral for suspected colon cancer, for example, would not be accepted without a faecal immunochemical test).

Training, supervising and inducting staff was more difficult when many were working remotely. If teams saw each other less frequently, relationship-building encounters and ‘corridor’ conversations were reduced, with knock-on impacts for individual and team learning and patient care. Those supervising trainees or allied professionals reported loss of non-verbal cues (eg, more difficult to assess how confident or distressed the trainee was).

Clinical and support staff regularly used initiative and situated judgement to compensate for an overall lack of system resilience (box 1). Many practices had introduced additional safety measures such as lists of patients who, while not obviously urgent, needed timely review by a clinician. Case 17 illustrates how a rule of thumb ‘if in doubt, put it down as urgent’ was introduced and then applied to avert a potentially serious mental health outcome. Case 18 illustrates how, in the context of insufficient in-person slots to accommodate all high-risk cases, a unique safety-netting measure was customised for a patient.
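
The rule of thumb from case 17 can be written out, purely illustratively, as a default in a triage helper. The red-flag terms, field names and the notion of the assessor being "certain" are our assumptions; in practice this judgement sits with staff and the duty doctor, not with software.

```python
# Illustrative encoding of the case 17 rule of thumb: support staff default
# uncertain requests to the urgent ('red') call-back list so that a duty doctor
# absorbs the risk. Red-flag terms and the certainty flag are assumptions.

RED_FLAGS = {"suicidal", "chest pain", "collapsed", "cannot breathe"}

def allocate_list(free_text, staff_is_certain_routine=False):
    text = free_text.lower()
    if any(flag in text for flag in RED_FLAGS):
        return "red"          # clear red flag: urgent call-back
    if not staff_is_certain_routine:
        return "red"          # "if in doubt, put it down as urgent"
    return "routine"

print(allocate_list("Sick note renewal; free text mentions feeling suicidal"))        # -> red
print(allocate_list("Repeat prescription request", staff_is_certain_routine=True))    # -> routine
```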

Poor communication is compounded by remote modalities

Because sense data (eg, sight, touch, smell) are missing, 28 remote consultations rely heavily on the history. Many safety incidents were characterised by insufficient or inaccurate information for various reasons. Sometimes (cases 2, 5, 6, 8, 9, 10 and 11), the telephone consultation was too short to do justice to the problem; the clinician asked few or no questions to build rapport, obtain a full history, probe the patient’s answers for additional detail, confirm or exclude associated symptoms and inquire about comorbidities and medication. Video provided some visual cues but these were often limited to head and shoulders, and photographs were sometimes of poor quality.

Cases 2, 4, 5 and 9 illustrate the dangers of relying on information provided by a third party (another staff member or a relative). A key omission (eg, in case 5) was failing to ask why the patient was unable to come to the phone or answer questions directly.

Some remote triage conversations were conducted using an inappropriate algorithm. In case 4, for example, the call handler accepted a pregnant patient’s assumption that leaking fluid was urine when the problem was actually ruptured membranes. The wrong pathway was selected; vital questions remained unasked; and a skewed history was passed to (and accepted by) the clinician. In case 8, the patient’s complaint of ‘throat’ pain was taken literally and led to ‘viral illness’ advice, overlooking a myocardial infarction.
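
The pathway-selection failure in case 4 can be shown with a toy sketch: matching the caller's own framing ("leaking urine") literally routes her down a urinary pathway, whereas a clarifying question about pregnancy and possible liquor loss changes the routing. The pathway names, parameters and rules below are invented for illustration and do not correspond to any real triage product.

```python
# Toy illustration of algorithm (pathway) selection in remote triage. Taking
# the caller's framing at face value selects the wrong pathway; a clarifying
# question changes the routing. All names and rules are invented.

def select_pathway(presenting_complaint, pregnant=False, fluid_could_be_liquor=False):
    if pregnant and fluid_could_be_liquor:
        return "obstetric emergency pathway"   # eg, possible ruptured membranes
    if "urine" in presenting_complaint.lower():
        return "urinary problems pathway"
    return "general assessment pathway"

# Caller's framing accepted at face value -> wrong pathway
print(select_pathway("leaking urine", pregnant=True))
# Clarifying question asked about the nature of the fluid -> appropriate pathway
print(select_pathway("leaking urine", pregnant=True, fluid_could_be_liquor=True))
```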

The cases in box 2 illustrate how staff compensated for communication challenges. In case 16, a GP plays a hunch that a father’s account of his child’s asthma may be inaccurate and converts a phone encounter to video, revealing the child’s respiratory distress. In case 19 (an in-person encounter but relevant because the altercation occurs partly because remote triage is the default modality), one receptionist correctly surmises that the patient’s angry demeanour may indicate urgency and uses her initiative and interpersonal skills to obtain additional clinical information. In case 20, a long-term condition nurse develops a labour-intensive workaround to overcome her elderly patients’ ‘lack of engagement’. More generally, we observed numerous examples of staff using both formal tools (eg, see ‘red list’ in case 17) and informal measures (eg, corridor chats) to pass on what they believed to be crucial information.

Remote consulting can provide limited clinical information

Cases 2 and 4–14 all describe serious conditions including congenital cyanotic heart disease, pulmonary oedema, sepsis, cancer and diabetic foot, which would likely have been readily diagnosed with an in-person examination. While patients often uploaded still images of skin lesions, these were not always of sufficient quality to make a confident diagnosis.

Several safety incidents involved clinicians assuming that a diagnosis made on a remote consultation was definitive rather than provisional. Especially when subsequent consultations were remote, such errors could become ingrained, leading to diagnostic overshadowing and missed or delayed diagnosis (cases 2, 8, 9, 10, 11 and 13). Patients with pre-existing conditions (especially if multiple or progressive), the very young and the elderly were particularly difficult to assess by telephone (cases 1, 2, 8, 10, 12 and 16). Clinical conditions difficult to assess remotely included possible cardiac pain (case 8), acute abdomen (case 2), breathing difficulties (cases 1, 6 and 10), vague and generalised symptoms (cases 5 and 14) and symptoms which progressed despite treatment (cases 9, 10 and 11). All these categories came up repeatedly in interviews and workshops as clinically risky.

Subtle aspects of a telephone consultation which may have contributed to safety incidents included the inability to fully appraise the patient’s overall health and well-being (including indicators relevant to mental health such as affect, eye contact, personal hygiene and evidence of self-harm), general demeanour, level of agitation and concern, and clues such as walking speed and gait (cases 2, 5, 6, 7, 8, 10, 12 and 14). Our interviews included stories of missed cases of new-onset frailty and dementia in elderly patients assessed by telephone.

In most practices we studied, most long-term condition management was undertaken by telephone. This may be appropriate (and indeed welcome) when the patient is well and confident and a physical examination is not needed. But diabetes reviews, for example, require foot examination. Case 7 describes the deterioration and death of a patient with diabetes whose routine check-ups had been entirely by telephone. We also heard stories of delayed diagnosis of new diabetes in children when an initial telephone assessment failed to pick up lethargy, weight loss and smell of ketones, and point-of-care tests of blood or urine were not possible.

Nurses observed that remote consultations limit opportunities for demonstrating or checking the patient’s technique in using a device for monitoring or treating their condition such as an inhaler, oximeter or blood pressure machine.

Safety netting was inadequate in many remote safety incidents, even when provided by a clinician (cases 2, 5, 6, 8, 10, 12 and 13) but especially when conveyed by a non-clinician (case 15). Expert interviewees identified that making life-changing diagnoses remotely and starting patients on long-term medication without an in-person appointment were also risky.

Our ethnographic data showed that various measures were used to compensate for limited clinical information, including converting a phone consultation to video (case 16), asking the patient if they felt they could wait until an in-person slot was available (case 18), visiting the patient at home (case 20) and enacting an ‘if the history doesn’t make sense, bring the patient in for an in-person assessment’ rule of thumb (case 21). Out-of-hours providers added examples of rules of thumb that their services had developed over years of providing remote services, including ‘see a child face-to-face if the parent rings back’, ‘be cautious about third-party histories’, ‘visit a palliative care patient before starting a syringe driver’ and ‘do not assess abdominal pain remotely’.

Remote modalities place additional burdens on patients and carers

Given the greater importance of the history in remote consultations, patients who lacked the ability to communicate and respond in line with clinicians’ expectations were at a significant disadvantage. Several safety incidents were linked to patients’ limited fluency in the language and culture of the clinician or to specific vulnerabilities such as learning disability, cognitive impairment, hearing impairment or neurodiversity. Those with complex medical histories and comorbidities, and those with inadequate technical set-up and skills (case 3), faced additional challenges.

In many practices, in-person appointments were strictly limited according to more or less rigid triage criteria. Some patients were unable to answer the question ‘is this an emergency?’ correctly, leading to their condition being deprioritised (case 15). Some had learnt to ‘game’ the triage system (eg, online templates 29 ) by adapting their story to obtain the in-person appointment they felt they needed. This could create distrust and lead to inaccurate information on the patient record.

Our ethnographic dataset contained many examples of clinical and support staff using initiative to compensate for vulnerable patients’ inability or unwillingness to take on the additional burden of remote modalities (cases 19 and 20 in box 2 30 31).

Training for remote encounters is often inadequate

Safety incidents highlighted various training needs for support staff members (eg, customer care skills, risks of making clinical judgements) and clinicians (eg, limitations of different modalities, risks of diagnostic overshadowing). Whereas out-of-hours providers gave novice GPs thorough training in telephone consultations (covering such things as attentiveness, rapport building, history taking, probing, attending to contextual cues and safety netting), 32–34 many in-hours clinicians had never been formally taught to consult by telephone. Case 17 illustrates how on-the-job training based on acknowledgement of contextual pressures and judicious use of rules of thumb may be very effective in averting safety incidents.

Statement of principal findings

An important overall finding from this study is that examples of deaths or serious harms associated with remote encounters in primary care were extremely rare, amounting to fewer than 100 despite an extensive search going back several years.

Analysis of these 95 safety incidents, drawn from multiple complementary sources, along with rich qualitative data from ethnography, interviews and workshops has clarified where the key risks lie in remote primary care. Remote triage and consultations expanded rapidly in the context of the COVID-19 crisis; they were occurring in the context of resource constraints, understaffing and high demand. Triage and care pathways were complex, multilayered and hard to navigate; some involved distributed work among multiple clinical and non-clinical staff. In some cases, multiple remote encounters preceded (and delayed) a needed in-person assessment.

In this high-risk context, safety incidents involving death or serious harm were rare, but those that occurred were characterised by a combination of inappropriate choice of modality, poor rapport building, inadequate information gathering, limited clinical assessment, inappropriate clinical pathway (eg, wrong algorithm) and failure to take account of social circumstances. These led to missed, inaccurate or delayed diagnoses, underestimation of severity or urgency, delayed referral, incorrect or delayed treatment, poor safety netting and inadequate follow-up. Patients with complex or multiple pre-existing conditions, cardiac or abdominal emergencies, vague or generalised symptoms, safeguarding issues and failure to respond to previous treatment, and those who (for any reason) had difficulty communicating, seemed particularly at risk.

Strengths and limitations of the study

The main strength of this study was that it combined the largest Safety I study undertaken to date of safety incidents in remote primary care (using datasets which have not previously been tapped for research) with a large, UK-wide ethnographic Safety II analysis of general practice as well as stakeholder interviews and workshops. Limitations of the safety incident sample (see final column in table 1) include that it was skewed towards very rare cases of death and serious harm, with relatively few lower-harm or near-miss incidents from which lessons might also have been learnt. Most sources were retrospective and may have suffered from biases in documentation and recall. We also failed to obtain examples of safeguarding incidents (which would likely turn up in social care audits). While all cases involved a remote modality (or a patient who would not or could not use one), it is impossible to definitively attribute the harm to that modality.

Comparison with existing literature

This study has affirmed previous findings that processes, workflows and training in in-hours general practice have not adapted adequately to the booking, delivery and follow-up of remote consultations. 24 35 36 Safety issues can arise, for example, from how the remote consultation interfaces with other key practice routines (eg, for making urgent referrals for possible cancer). The sheer complexity and fragmentation of much remote and digital work underscores the findings from a systematic review of the importance of relational coordination (defined as ‘a mutually reinforcing process of communicating and relating for the purpose of task integration’ (p 3) 37) and psychological safety (defined as ‘people’s perceptions of the consequences of taking interpersonal risks in a particular context such as a workplace’ (p 23) 38) in building organisational resilience and assuring safety.

The additional workload and complexity associated with running remote appointments alongside in-person ones is cognitively demanding for staff and requires additional skills for which not all are adequately trained. 24 39 40 We have written separately about the loss of traditional continuity of care as primary care services become digitised, 41–43 and about the unmet training needs of both clinical and support staff for managing remote and digital encounters. 24

Our findings also resonate with research showing that remote modalities can interfere with communicative tasks such as rapport building, establishing a therapeutic relationship and identifying non-verbal cues such as tearfulness 35 36 44 ; that remote consultations tend to be shorter and feature less discussion, information gathering and safety netting 45–48 ; and that clinical assessment in remote encounters may be challenging, 27 49 50 especially when physical examination is needed. 35 36 51 These factors may rarely contribute to incorrect or delayed diagnoses, underestimation of the seriousness or urgency of a case, and failure to identify a deteriorating trajectory. 35 36 52–54

Even when systems seem adequate, patients may struggle to navigate them. 23 30 31 This finding aligns with an important recent review of cognitive load theory in the context of remote and digital health services: because such services are more cognitively demanding for patients, they may widen inequities of access. 55 Some patients lack navigating and negotiating skills, access to key technologies 13 36 or confidence in using them. 30 35 The remote encounter may require the patient to have a sophisticated understanding of access and cross-referral pathways, interpret their own symptoms (including making judgements about severity and urgency), obtain and use self-monitoring technologies (such as a blood pressure machine or oximeter) and convey these data in medically meaningful ways (eg, by completing algorithmic triage forms or via a telephone conversation). 30 56 Furthermore, the remote environment may afford fewer opportunities for holistically evaluating, supporting or safeguarding the vulnerable patient, leading to widening inequities. 13 35 57 Previous work has also shown that patients with pre-existing illness, complex comorbidities or high-risk states, 58 59 language non-concordance, 13 35 inability to describe their symptoms (eg, due to autism 60 ), extremes of age 61 and those with low health or system literacy 30 are more difficult to assess remotely.

Lessons for safer care

Many of the contributory factors to safety incidents in remote encounters have been suggested previously, 35 36 and align broadly with factors that explain safety incidents more generally. 53 62 63 This new study has systematically traced how upstream factors may, very rarely, combine to contribute to avoidable human tragedies—and also how primary care teams develop local safety practices and cultures to help avoid them. Our study provides some important messages for practices and policymakers.

First, remote encounters in general practice are mostly occurring in a system designed for in-person encounters, so processes and workflows may work less well.

Second, because the remote encounter depends more on history taking and dialogue, verbal communication is even more mission-critical. Working remotely under system pressures and optimising verbal communication should both be priorities for staff training.

Third, the remote environment may increase existing inequities as patients’ various vulnerabilities (eg, extremes of age, poverty, language and literacy barriers, comorbidities) make remote communication and assessment more difficult. Our study has revealed impressive efforts from staff to overcome these inequities on an individual basis; some of these workarounds may become normalised and increase efficiency, but others are labour intensive and not scalable.

A final message from this study is that clinical assessment provides less information when a physical examination (and even a basic visual overview) is not possible. Hence, the remote consultation has a higher degree of inherent uncertainty. Even when processes have been optimised (eg, using high-quality triage to allocate modality), but especially when they have not, diagnoses and assessments of severity or urgency should be treated as more provisional and revisited accordingly. We have given examples in the Results section of how local adaptation and rule breaking bring flexibility into the system and may become normalised over time, leading to the creation of locally understood ‘rules of thumb’ which increase safety.

Overall, these findings underscore the need to share learning and develop guidance about the drivers of risk, how these play out in different kinds of remote encounters and how to develop and strengthen Safety II approaches to mitigate those risks. Table 2 shows proposed mitigations at staff, process and system levels, as well as a preliminary list of suggestions for patients, which could be refined with patient input using codesign methods. 64

Table 2: Reducing safety incidents in remote primary care

Unanswered questions and future research

This study has helped explain where the key risks lie in remote primary care encounters, which in our dataset were almost all by telephone. It has revealed examples of how front-line staff create and maintain a safety culture, thereby helping to prevent such incidents. We suggest four key avenues for further research. First, additional ethnographic studies in general practice might extend these findings and focus on specific subquestions (eg, how practices identify, capture and learn from near-miss incidents). Second, ethnographic studies of out-of-hours services, which are mostly telephone by default, may reveal additional elements of safety culture from which in-hours general practice could learn. Third, the rise in asynchronous e-consultations (in which patients complete an online template and receive a response by email) raises questions about the safety of this new modality which could be explored in mixed-methods studies including quantitative analysis of what kinds of conditions these consultations cover and qualitative analysis of the content and dynamics of the interaction. Finally, our findings suggest that the safety of new clinically related ‘assistant’ roles in general practice should be urgently evaluated, especially when such staff are undertaking remote assessment or remote triage.

Ethics statements

Patient consent for publication

Not applicable.

Ethics approval

Ethical approval was granted by the East Midlands—Leicester South Research Ethics Committee and UK Health Research Authority (September 2021, 21/EM/0170 and subsequent amendments). Access to the NHS Resolution dataset was obtained by secondment of RP via an honorary employment contract, where she worked with staff to de-identify and fictionalise relevant cases. The Remote by Default 2 study (referenced in the main text) was co-designed by patients and lay people; it includes a diverse patient panel. Oversight was provided by an independent external advisory group with a lay chair and patient representation. A person with lived experience of a healthcare safety incident (NS) is a co-author on this paper and provided input to data analysis and writing up, especially the recommendations for patients in table 2.

Acknowledgments

We thank the participating organisations for cooperating with this study and giving permission to use fictionalised safety incidents. We thank the participants in the ethnographic study (patients, practice staff, policymakers, other informants) who gave generously of their time and members of the study advisory group.

  • Sarbadhikari SN ,
  • Jacob AG , et al
  • Hall Dykgraaf S ,
  • Desborough J ,
  • de Toca L , et al
  • Koonin LM ,
  • Tsang CA , et al
  • England NHS
  • Papoutsi C ,
  • Greenhalgh T
  • Healthcare Safety Investigation Branch. NHS 111’s response to callers with COVID-19-related symptoms during the pandemic. 2022. Available: https://www.hsib.org.uk/investigations-and-reports/response-of-nhs-111-to-the-covid-19-pandemic/nhs-111s-response-to-callers-with-covid-19-related-symptoms-during-the-pandemic [Accessed 25 Jun 2023].
  • Royal College of General Practitioners
  • NHS Confederation
  • Gupta PP , et al
  • Panagioti M ,
  • Keers RN , et al
  • Panesar SS ,
  • deSilva D ,
  • Carson-Stevens A , et al
  • Campbell JL ,
  • Britten N ,
  • Green C , et al
  • Huibers L , et al
  • Hollnagel E ,
  • Braithwaite J
  • Institute of Medicine (US)
  • Jerak-Zuiderent S
  • Greenhalgh T ,
  • Alvarez Nishio A , et al
  • Hemmings N , et al
  • Hughes G , et al
  • Salisbury C , et al
  • Berens E-M ,
  • Nowak P , et al
  • Macdonald S ,
  • Browne S , et al
  • Edwards PJ ,
  • Bennett-Britton I ,
  • Ridd MJ , et al
  • Warren C , et al
  • Challiner J , et al
  • Wieringa S ,
  • Greenhalgh T , et al
  • Rushforth A , et al
  • Edmondson AC ,
  • Eddison N ,
  • Healy A , et al
  • Paparini S , et al
  • Byng R , et al
  • Moore L , et al
  • McKinstry B ,
  • Hammersley V ,
  • Burton C , et al
  • Gafaranga J ,
  • McKinstry B
  • Seuren LM ,
  • Wherton J , et al
  • Donaghy E ,
  • Parker R , et al
  • Sharma SC ,
  • Thakker A , et al
  • Johnsen TM ,
  • Norberg BL ,
  • Kristiansen E , et al
  • Wherton J ,
  • Ferwerda R ,
  • Tijssen R , et al
  • NHS Resolution
  • Marincowitz C ,
  • Bath P , et al
  • Antonio MG ,
  • Williamson A ,
  • Kameswaran V , et al
  • Oudshoorn N
  • Winters D ,
  • Newman T , et al
  • Huibers L ,
  • Renaud V , et al
  • Remote consultations. n.d. Available: https://www.gmc-uk.org/ethical-guidance/ethical-hub/remote-consultations
  • Doherty M ,
  • Neilson S ,
  • O’Sullivan J , et al
  • Carson-Stevens A ,
  • Hibbert P ,
  • Williams H , et al
  • Edwards A ,
  • Powell C , et al
  • Morris RL ,
  • Fraser SW ,

X @dakinfrancesca, @trishgreenhalgh

Contributors RP led the Safety I analysis with support from AC. The Safety II analysis was part of a wider ethnographic study led by TG and SS, on which all other authors undertook fieldwork and contributed data. TG and RP wrote the paper, with all other authors contributing refinements. All authors checked and approved the final manuscript. RP is guarantor.

Funding Funding was from NIHR HS&DR (grant number 132807) (Remote by Default 2 study) and NIHR School for Primary Care Research (grant number 594) (ModCons study), plus an NIHR In-Practice Fellowship for RP.

Competing interests RP was National Professional Advisor, Care Quality Commission 2017–2022, where her role included investigation of safety issues.

Provenance and peer review Not commissioned; externally peer reviewed.

Linked Articles

  • Editorial: Examining telehealth through the Institute of Medicine quality domains: unanswered questions and research agenda. Timothy C Guetterman, Lorraine R Buis. BMJ Quality & Safety 2024;33:552–555. Published Online First: 09 May 2024. doi:10.1136/bmjqs-2023-016872
