US Government strengthens international aid model with qualitative data analysis

USAID uses NVivo to improve learning agenda

Background

The United States Agency for International Development (USAID) is a U.S. government agency that betters the lives of millions across the globe through areas such as improving agricultural productivity, combating maternal and child mortality, providing immediate disaster relief, fostering private sector development, elevating the role of women and girls, and providing humanitarian support.[i] Since its inception during the Kennedy administration, USAID has made an impact in countries across the world.
Funded by the USAID Bureau for Policy, Planning and Learning (PPL), the USAID LEARN contract—implemented by Dexis Consulting Group—is designed to support strategic learning and knowledge management at USAID to improve the performance and effectiveness of its programs around the world.[ii]
USAID LEARN works with multiple teams and departments within USAID through an approach called "collaborating, learning and adapting" (CLA). Part of this approach includes using Learning Agendas, which are sets of agency-specific questions directly related to an agency's work. As these questions are answered, the Learning Agenda guides staff through organizational learning in order to create an environment of growth and success.

The Challenge

USAID LEARN conducted 60 interviews and held two focus groups to properly analyze the Learning Agenda's current landscape within federal agencies. They reviewed brochures and programs already informed by Learning Agendas across agencies, including the Department of Housing and Urban Development (HUD), the Department of Agriculture (USDA), the Department of Labor (DOL), and the United Nations Research Institute for Social Development (UNRISD).[iii]
The analysis accumulated textual, video, audio, PDF, image, and numerical data. However, traditional quantitative data analysis tools can only analyze the numerical portion of that data, and the traditional pen, paper, highlighter, and sticky-note method of analysis would take too long. Individually reviewing each image, interview, and document would consume hours and risk losing important themes or ideas along the way. USAID LEARN and Dexis Consulting Group needed a different solution.

Introducing NVivo

USAID LEARN assessed software that could analyze all of their qualitative (non-numerical) data quickly and meaningfully. Because NVivo is often recognized as the industry standard for qualitative data analysis in education, many of the analysts had previous, positive experience working with the solution. NVivo's functionality, accuracy and speed were exactly what the team was seeking, so it was the unanimous choice.

Results

The USAID LEARN team was impressed to learn from their NVivo research that the Learning Agenda, an integral way to understand both how and why to grow, was in use far more widely than anticipated.

“We discovered that there are over 20 operating units within USAID using Learning Agendas. A variety of people, from HUD to the UN, in all different technical areas are also using them to learn and grow,” said Matt Baker, USAID LEARN’s Monitoring Evaluation, Research and Learning Specialist.

NVivo also helped the USAID LEARN team uncover the themes that make up their Learning Agenda questions.

"Because NVivo can interpret live language, it was able to read over 100 questions and show us how we could distill them down to the best five," said Baker.

NVivo revealed the top reasons why Learning Agendas are useful by identifying three primary motivations for those who use them:

  1. Expectations of accountability, especially in response to leadership demands
  2. Leadership transitions and structural, strategic, or policy changes
  3. Responses to identified program-related needs

Many other discoveries made in the landscape analysis will guide the USAID LEARN contract as the team begins the one-to-two-year process of developing strong Learning Agendas for USAID. Because of this, USAID has the opportunity to improve the effectiveness of its important work, both in Washington and across the globe.

[i] https://www.usaid.gov/what-we-do
[ii] https://usaidlearninglab.org/learn-contract
[iii] https://usaidlearninglab.org/sites/default/files/resource/files/landscape_analysis_report_04_10_17.pdf

ABOUT THE AUTHOR

QSR International with USAID and Dexis Consulting Group

Every day, QSR International helps 1.5 million researchers, marketers and others to utilize Qualitative Data Analysis (QDA) to uncover deeper insights contained within the "human data" collected via social media, consumer and community feedback and other means. We give people the power to make better decisions by uncovering more insights to advance their area of exploration.

Analysing large survey data using automated insights

NVivo Plus enables quick identification of themes and sentiment in University-wide student feedback.

An impossible task?

The National Student Survey (NSS) is an annual survey which gathers the opinions and experiences of final-year undergraduate students on the course they have studied. It is widely recognised as an authoritative and highly influential survey in building a picture of the quality of life in higher education in the UK.

At Lancaster University, as with many other institutions, no systematic analysis of the qualitative element of the survey had to date been performed. The challenge was to develop a better and deeper understanding from the data collected, to improve the student experience of living and studying at Lancaster.

This was taken on as a pilot project by Steve Wright, Ph.D., a Learning Technologist in the Faculty of Health and Medicine at Lancaster University.

Finding the right tool for the task

The NSS is very important to Lancaster, as the score that is achieved has a huge influence on the current standing of the University.

Lancaster was awarded Gold in the Teaching Excellence Framework (TEF) following an outstanding NSS score (84.3% positive score) in 2016. It was also recently named University of the Year by the Times and the Sunday Times Good University Guide 2018.

The aim of the NSS pilot project was to feed into a broad spectrum of institutional activity in preparation for the University’s TEF submission and evaluation.

The first phase looked at a comparison of three possible tools to use for the qualitative analysis:

NVivo was chosen because:

NVivo’s ability to present data visually was also an important factor in the selection process. “In NVivo the visualisations enable you to explore the data which makes it a very powerful tool for both analysis and presentation,” said Steve. Rather than put a bar chart in a report, he presents directly from within NVivo. “The ability to use it as an interactive presentation is what makes it so powerful, for example, to be able to click on a column in the histogram and get the underlying data,” he said.

Testing the automation of sentiment and themes

Lancaster University received 8000 NSS comments, amounting to 25,000 words to be analysed, across the institution.

The NSS asked students to complete three open-ended comments:

Steve was interested in finding out what insights could be gleaned from the data, particularly using a systematic approach that could be replicated. The approach would pull out key topics, group them, and then explore the sentiment related to each. Because sentiment is built into the NSS structure (students are asked for a positive and a negative comment), it is possible to check the accuracy of NVivo's automated sentiment analysis against those known labels, and then extrapolate from that, or use it as evidence of the confidence you can have in NVivo's automated sentiment tagging for other datasets.
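NVivo performs this cross-check through its sentiment auto-coding and matrix coding queries, but the underlying logic is simple enough to sketch. The Python sketch below, with invented comment labels, cross-tabulates the known NSS comment type against an automated sentiment tag and reports the agreement rate; it illustrates the validation idea, not NVivo's internals.

    from collections import Counter

    # Hypothetical comments: "nss_section" is the known label from the survey
    # structure; "auto_sentiment" is what an automated tagger assigned.
    comments = [
        {"nss_section": "positive", "auto_sentiment": "positive"},
        {"nss_section": "positive", "auto_sentiment": "positive"},
        {"nss_section": "negative", "auto_sentiment": "negative"},
        {"nss_section": "negative", "auto_sentiment": "positive"},  # a false match
    ]

    # Cross-tabulate the known label against the automated tag
    crosstab = Counter((c["nss_section"], c["auto_sentiment"]) for c in comments)
    agreement = sum(n for (known, auto), n in crosstab.items() if known == auto)
    print(f"Agreement: {agreement}/{len(comments)} = {agreement / len(comments):.0%}")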

“We have loads of text but mostly what happens is, we share it with no analysis, and only basic structure with the key people in a department for them to read through. And the easiest thing that happens, when they receive three or four pages of comments, is to read the first few and construct a narrative in their head and immediately get information bias,” said Steve.

"NVivo Plus tools were really good for this. I was able to take this minimally structured data – which only gave the department it related to and the type of comment – and then extract topics and cross-reference those with the sentiment, as well as provide summaries," he said.

Benefits of using NVivo

The analysis was well received; however, it drew one critical response, which contended that the analysis showed nothing beyond the previous statistical analysis. Steve argued that was not the case. "The statistics show that students are broadly happy here. They like certain things, but specific areas are shown by the quantitative statistics as being lower. However, what the statistics don't do is give any real insights into the processes and experiences that inform those lower scores."

“What the qualitative analysis allows you to do, is to pull out those topics and break them apart to see why some departments had a higher score – to identify good practice, and some of the specific reasons given where there were lower scores to inform interventions and development, for example,” Steve said.

NVivo's sentiment analysis capabilities played an important part in the data analysis, particularly given the way survey data is collected for the NSS.

"Because of the structure of the NSS, of positive comment and negative comment, we were able to cross-tabulate that with NVivo's sentiment analysis and get a kind of built-in check of accuracy," said Steve. "And it was very high. It tends to get things wrong by addition, not omission; i.e., it will classify something as both positive and negative when it's just positive. The classic instance being 'I had a load of personal problems and the department was fantastic.' That is a positive comment but, because it has the word 'problems' in it, it is automatically classified as negative as well.

"Overwhelmingly NVivo classifies it correctly. We know that there will be some false matches, but they're a minority and, given the volume of data it enables us to work with, they can be accounted for," he said. "What's more, this gives us a baseline for being confident in the sentiment analysis when we apply this approach to other student feedback and comment data without this structure."
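Steve's "wrong by addition" pattern is characteristic of lexicon-based sentiment tagging. As a minimal, hypothetical illustration (the word lists below are invented stand-ins, not NVivo's actual lexicon), a naive tagger assigns both labels the moment a positive comment happens to contain a negative trigger word:

    # Invented mini-lexicons; real tools use far larger weighted dictionaries.
    POSITIVE = {"fantastic", "great", "supportive"}
    NEGATIVE = {"problems", "poor", "unhelpful"}

    def tag_sentiment(comment: str) -> set:
        words = {w.strip(".,!?'").lower() for w in comment.split()}
        tags = set()
        if words & POSITIVE:
            tags.add("positive")
        if words & NEGATIVE:
            tags.add("negative")  # "problems" triggers this even in praise
        return tags

    print(tag_sentiment("I had a load of personal problems and the department was fantastic."))
    # -> {'positive', 'negative'}: tagged as both, though the comment is positive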

An ambition for the future is to share the project across the University with other staff who are familiar with NVivo, as opposed to sharing a static report. Staff could then delve straight into the project and discover insights for themselves.

Future Work

The University is planning to repeat the analysis next year, and build upon the framework.

From those who have seen it, there has been some real interest. “I think the real potential is with student or staff surveys. Most organisations have staff surveys, and they ask for extensive qualitative comments and usually, don’t do any sort of systematic analysis with them,” said Steve.

The point of the project was to develop a method, and NVivo assisted with a better analysis of this data. “The questions were:

And I really think the support NVivo provided has a real potential for other sectors with those practical priorities for working with qualitative data rather than the software being part of the somewhat arcane, and highly theoretical, pursuits of qualitative analysis within academia,” said Steve.

He also suggests that there's a significant opportunity for commercial and public-sector organisations that need to work with unstructured datasets for analysing customer experience, with a lot of potential for further development of methods and approaches like those introduced here.

About Steve Wright

Earning his Ph.D. in E-Research and Technology Enhanced Learning in 2014, Steve Wright works as a Learning Technologist in the Faculty of Health and Medicine at Lancaster University in the UK. He is also an independent CAQDAS trainer, consultant and certified NVivo expert. As a researcher, he completed five small-scale research projects in addition to his Ph.D. thesis on sensory learning, with a focus on craft beer, to which he took an ethnographic approach.

As an academic-related professional, he's particularly interested in the e-research area: discovering what is possible with digital tools and how they'll influence new approaches remains his focus. He also has an interest in the development, research, and teaching of methods. Steve's consultancy and training work is through www.caqdas.co.uk

About QSR International

Every day, QSR International helps 1.5 million researchers, marketers and others to utilize Qualitative Data Analysis (QDA) to uncover deeper insights contained within the “human data” collected via social media, consumer and community feedback and other means. We give people the power to make better decisions by uncovering more insights to advance their area of exploration.

NVivo in mixed-methods research

NVivo has a long history of assisting qualitative researchers with managing and analyzing their data, and of being part of complex research. What you might be interested to know is that NVivo is also being used by mixed methods researchers to help get a wider picture of the data they're examining.

Background

Melody Goodman, PhD, is a biostatistician and an Associate Professor in the Department of Biostatistics at New York University Global Public Health. Her work is focused on racial disparities in urban settings; she is particularly interested in examining how place is a determinant of disparity, looking at how 'where you live, work, play and pray' impacts your health outcomes. She works in mixed methods research and, as a biostatistician, has a traditionally quantitative view. Much of Melody's research has been on health and research literacy and how people understand and use health information.

Speaking about the impact of her research, Melody said “My attempt at working towards addressing disparities has been to increase the knowledge level of the receiver of the information. My work spans a wide spectrum from the generation of new statistical methods all the way out to community engaged research, including a program where we train community members on research methods. I don’t specialize in any particular disease area or any statistical methodological area.”

Introduction to NVivo

Melody is currently using NVivo to validate a quantitative survey tool that assesses the level of community partner engagement in research. When her team was required to evaluate their work, they found that there were no existing validated tools, and they would be required to create their own. "We thought that we could find some existing measure and use that to evaluate our programs. But when we went to the literature there really weren't any measures that assess how engaged people felt they were in research processes and research centers. So, we decided to develop our own measure, which was really trying to evaluate a big comprehensive research center that had multiple research projects," Melody said.

As the team was developing a brand-new measure with no "gold standard" to validate against, they found the qualitative work they undertook was important for a number of reasons. "We were trying to evaluate from the community health stakeholder's point of view, instead of the academic's perspective, and considering what the benefit is for a community member for participating in our research and how engaged they feel in the process," Melody said.

Development of a survey tool requires mixed methods, and the team uses NVivo to analyze the qualitative data. However, they are also using the software as a project management tool, as Melody noted: "We had so many different survey data sets and there's a lot of rounds, with multiple surveys of experts and participants; we really used it to see where we were, and what data we'd collected. Not only were we using it to analyze the qualitative data, we also used it to keep everything in one place for this project."

Further work with NVivo

Melody is also using NVivo for another project. She read work by sociologist Elijah Anderson, who coined the term "white space" for spaces that exclude anyone who isn't white, either explicitly or implicitly. It was particularly timely, given the current social climate and the history of St Louis (where this project originated) with segregation, white flight, suburbanization, and gentrification.

Melody looked at this existing research from Anderson through the lens of her own work and her own social experiences. "It's great work, and he had lots of ethnographic stories, but for me as a biostatistician I really want to measure it. As a black person, I could relate to it, but I didn't feel like I could convince someone who wasn't black that this idea really existed. If you didn't have those life experiences, you may view ethnographic work as anecdotal stories," she said.

Melody was interested in creating a survey measure that would assess if a space was a "white space". "The first thing we needed to do was talk to residents of St Louis, and it became clear that we needed to speak to both black and white residents; all up we collected around 50 interviews," Melody explained.

As a biostatistician, Melody was particularly interested in gaining a full picture of the areas participants were mentioning in their interviews, to help inform the research further. "In the interviews, the participants talked about different cities, towns, and places such as shopping malls. NVivo was great because I could link census data, which has the actual racial composition and other factors such as percentage of poverty and median household income, for all the places they mentioned, so we could then code the data not only for the town, but also call upon the quantitative data that goes along with it. This is where NVivo showed what it's really powerful for in mixed methods work," Melody said.
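NVivo supports this kind of linkage through case classifications and attribute values; as a rough Python sketch of the same idea (the figures below are invented placeholders, not real census values), each coded place mention can be joined to the quantitative attributes of that place:

    import pandas as pd

    # Places coded in interviews (hypothetical codings)
    coded_mentions = pd.DataFrame({
        "interview": ["P01", "P02", "P02"],
        "place": ["Ladue", "Ferguson", "Ladue"],
        "code": ["white space", "exclusion", "white space"],
    })

    # Census-style attributes for each place (invented placeholder values)
    census = pd.DataFrame({
        "place": ["Ladue", "Ferguson"],
        "pct_white": [88.0, 28.0],
        "median_income": [203000, 42000],
    })

    # Each qualitative code can now be read alongside its quantitative context
    linked = coded_mentions.merge(census, on="place")
    print(linked)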

As a mixed methods researcher, Melody thinks of the quantitative data as the 'what', and the qualitative data as the 'why'. Having the ability to merge the two and compare, for example, someone's report in an interview that a space is predominately white with the available census data, and triangulate that using NVivo, has been an important part of this project.

Project outcomes

"Ultimately with this project, we want to create a quantitative survey tool that will allow others to assess whether a space is perceived to be a 'white space'. Currently, we're a long way off from that, but we had to start with the qualitative, asking people how they think about 'white space', how they talk about 'white space', how they define it, and if they even know what we're talking about. It became so timely because of all the things that are going on in our country, and in this community in particular. We'll probably get more from it than just the survey, because people really gave us so much information and were incredibly honest in their responses," Melody said.

When it came to selecting software for this project, Melody's previous experience with NVivo led her to choose it again. "I had difficulty understanding competitor software. As a traditionally quantitative person, with NVivo I could make sense of what my team were doing, and they were able to generate reports that I could understand," Melody said.

“When I decided that I was going to learn a qualitative software package, I felt most comfortable attempting to learn NVivo. And it was a challenge to think like NVivo, mainly because I don’t think in qualitative terms, but I found the training to be immensely helpful. Once I did that, I could then go and play and learn more myself” she said.

As for the future of Melody's mixed methods work, the outlook is positive. "I think I'm in a good space. There's a need for researchers in mixed methods who can understand the quantitative and the qualitative, and go in and interrogate that data and triangulate the qualitative and quantitative findings," she said.

ABOUT THE AUTHOR

QSR International with Dr Melody Goodman, NYU

Dr. Goodman conducts applied biostatistical and survey research for community-based interventions and health disparities research with a strong focus on measurement. Additionally, through academic-community collaborations, she implements, evaluates, and enhances the infrastructure of community-engaged research, in order to mitigate health disparities. With numerous funders supporting her work, she has published over 70 peer-reviewed journal articles.

From journalism to research and evaluation: a research journey

Dr. Anupama Shekar shares her story of following her passion for ensuring equitable access to quality education, no matter a child's economic circumstance, and how it took her from a career in journalism in India to post-doctoral research in the U.S. Her work features a long history with qualitative research and evaluation tools, including NVivo.

Introduction

Dr. Anupama Shekar is a qualitative researcher and program evaluator with a passion for the field of educational research and evaluation. She is currently an Evaluation Consultant with the Center on Research and Evaluation at the Simmons School of Education and Human Development at Southern Methodist University (SMU) in Dallas, Texas. Her prior experience includes working as the Director for Evaluation at Teaching Trust in Dallas, Texas, an education leadership non-profit organization. She also worked as an associate researcher, and prior to that a post-doctoral research associate, with WIDA, a national and international preK-12 language development and assessment program housed at the Wisconsin Center for Education Research at the University of Wisconsin-Madison.

Before undertaking her postdoctoral work, she earned her PhD at the Department of Educational Leadership and Policy Analysis at the University of Wisconsin-Madison. She also assisted in the development and evaluation of WIDA’s data literacy program known as LADDER for English language learners. Funded by the US Department of Education's Office of English Language Acquisition, this project essentially helped participating schools make data-driven decisions about English language learners.

Prior to coming to the U.S., Anupama received her Master's degree in journalism at the Symbiosis Institute of Mass Communication and worked as a print journalist for the New Indian Express, a national mainstream newspaper in Tamil Nadu, Southern India. In her role as a print journalist, she focused on public educational leadership and policy issues in South India, where her journey in education began. The school leaders she met served children from low-income families. They greatly impacted her and she was inspired to leave journalism and study educational leadership and policy.

A journey in education leadership and policy research

Anupama recalls why she felt compelled to change careers. “It was the initial encounter that I had with several children from low-income communities,” she said. “They really awakened my interest in studying education leadership and policy formally and improving the public school system in India.”

"Many years later, the first story I wrote for The New Indian Express in 2006 still remains on my desk," she said. "It continues to keep me focused on why I began this journey and the importance of working to improve the lives of children from low-income communities anywhere in the world."

In that story, a 14-year-old girl said that she had to work to feed her mother and brothers, and could not go to school. That really stuck with Anupama. Although education is a fundamental right of children under the Indian constitution, thousands of underprivileged children still have no real access to a school or quality education. "At that point I started developing an interest in research and evaluation in education leadership. I wanted to study successful school leadership practices and leaders who advocate for children from low-income groups despite the odds," Anupama said.

It was when Anupama's doctoral studies and WIDA work began that NVivo came into the picture. Her professors and other researchers used it, and her own research involved writing up case studies of school leaders in public schools in Tamil Nadu, Southern India. Previous research in the U.S. had examined the contribution of parent involvement to children's educational outcomes, but very little had focused on the role of school principals in fostering parent, family and community involvement practices.

Her analysis of previous research led her to design an exploratory, qualitative, cross-case study and informed her research questions: how do public school leaders in Tamil Nadu foster parent and family involvement? And what are the similarities and differences across schools?

“I used NVivo 9 to explore the initial transcriptions of interviews, contextual observations and field notes. It gave me an initial understanding of all the data and how the school heads initiated and supported parent involvement practices at their schools,” said Anupama.

While NVivo helped her gain an initial understanding of the themes in her data, Anupama also used a traditional, manual coding process while interrogating her qualitative data to unpack the complexities in her qualitative case studies.

"Manual coding helped me analyze the story of each headmaster and headmistress and see patterns. I needed to get close to the data to figure out the leaders' actions more deeply," she said. "I also used memos, and documents, and artifacts. I sort of let my curiosities as a researcher take over. I feel moving between manual and software coding really helped me with my dissertation analyses and to triangulate my own thinking and findings," said Anupama.

She notes how innovative uses of qualitative data helped her reach a richer understanding of experiences in the case studies. "The main study findings were that the school heads over time created a continuum of overlapping actions that helped foster effective parent involvement. I was really able to get to the core of the school heads' actions through the use of multiple analysis techniques and constant reflection on the qualitative data. As a qualitative researcher, you really commit to spending extended periods of time to get to the heart of the story," Anupama said.

During Anupama's doctoral work at WIDA, the LADDER project convened many focus groups, as well as individual interviews and mixed methods evaluations. "Each year we produced a program evaluation report and wrote up findings, so NVivo was useful as one of the tools that helped us identify themes and patterns," said Anupama. "WIDA still offers the LADDER program, and I was there when they were developing the whole program from the ground up," she said.

When Anupama moved on to her postdoctoral work, WIDA's Teaching and Learning team was trying to understand best practices in professional learning and professional development. One large project involved multiple qualitative open-ended questions. Anupama found her prior experience helpful. "NVivo was a great tool for me to use then because we were working with a lot of diverse data and it ended up providing great insights," she said.

Most recently she worked as the Director of Evaluation with Teaching Trust, an educational leadership non-profit in Dallas. Teaching Trust offers high quality training and support for future school leaders, school leadership teams, and teacher leaders to ensure that children in low-income schools across Texas have access to an excellent education.

“Teaching Trust has a strong alumni base and educators who graduated from Teaching Trust programs are out in the field driving positive change for students,” said Anupama. “The Teaching Trust Alumni Network team always gathered and used data effectively to drive their programmatic decisions. In this case, the team was trying to understand through qualitative data, the impact of the Teaching Trust alumni programming from the participant's point of view and how future programming might be improved and changed,” she said.

The Alumni Network team conducted qualitative focus groups of current and former participants. “After every focus group, our team met to extract meaning from the data — the impacts of Teaching Trust programming on participants, personal leadership, student and school outcomes, and what it really meant to be part of the Teaching Trust community,” said Anupama.

The team used both manual and software coding techniques with their qualitative data. "We took a grounded theory approach by listening and gathering data, and bridging perspectives to really unpack the themes and patterns," said Anupama.

"My former colleagues used pen and paper, and I used NVivo to code," Anupama said. "There is a lot of power in combining multiple qualitative coding techniques because that adds to the validity and reduces researcher isolation. We presented the lessons learned and techniques on the collaborative qualitative approach in a webinar to the American Evaluation Association," she said.

A passion for qualitative insights

Anupama’s career has evolved through her interest and passion for educational research and evaluation and ensuring people have equitable access to quality education, no matter their background or economic circumstance. Her appreciation for the importance of qualitative research and evaluation has been at the heart of her work.

“Qualitative data tells you something that numbers cannot, and helps you dig deeper to explore the complexities and find powerful insights,” she said. “As a qualitative researcher and evaluator, my challenge has been to find meaning in data, to keep asking why, and to continue digging,” said Anupama.

Anupama also hopes to continue sharing the power of qualitative research and evaluation through her website and blog in the near future. “There is a renewed energy in qualitative research and evaluation that is really exciting. There are people around the world who use qualitative data in very different ways in their work. I think it will be valuable to hear and share their stories as continual learning is the core of qualitative work.”

Next steps in career

Anupama hopes to apply her learnings in qualitative research and evaluation in her current work at the Center on Research and Evaluation (CORE) at the Simmons School of Education and Human Development at Southern Methodist University (SMU) in Dallas, Texas.

“I am excited to be doing projects for CORE and collaborating with their diverse and strong team of researchers and evaluators led by Dr. Annie Wright. They are at the forefront of conducting rigorous research and evaluation that focuses on examining critical issues around children, families and communities.

CORE is constantly striving to push boundaries and was selected as one of the Annie E. Casey Foundation’s expert evaluators nationwide. This shows the focus CORE has on issues around diversity, equity and social justice. I am honored to be learning as a researcher and evaluator with this incredible organization.”

You can follow CORE’s work on Facebook and Twitter.

ABOUT THE AUTHOR

QSR International with Dr Anupama Shekar

Dr. Anupama Shekar is a qualitative researcher and program evaluator with a passion for the field of educational research and evaluation. She is currently an Evaluation Consultant with the Center on Research and Evaluation at the Simmons School of Education and Human Development at Southern Methodist University (SMU) in Dallas, Texas. Her prior experience includes working as the Director for Evaluation at Teaching Trust in Dallas, Texas, an education leadership non-profit organization.

The Power of Research and Truth for UX Professionals

NVivo shows its power in providing defensible findings, in the field of User Experience research.

Background

Zoe Rosen is an anthropologist working in the field of UX (User Experience) research, where the researcher applies various techniques to give context and discover insights. These findings are then used to inform the design process for a product or process. This type of research is needed to discover new information about users, learn the facts, and find any current issues. UX research also helps teams understand the users themselves and their needs, and is used to identify the requirements of the product.

UX research is incredibly important for anyone designing anything from a product, an interface, or content to a physical interactive service. UX research provides organizations with insights that empower them to invest capital wisely, by creating something their customers will want, based on tested and researched ideas – not just assumptions.

Zoe works as a UX Researcher contracting to different companies, generally as a part of the strategy and service design teams, and tends to work in one industry on an overarching research piece from 3 to 12 months at a time. She has uncovered important insights for companies across the banking, insurance, telecommunications and medical devices industries, to name a few.

"Because I move between organizations and industries quite frequently, often I'm not an expert in the specific industry I'm working in. Consequently, I don't always know what themes I should be looking for, and NVivo is extremely helpful in allowing me to identify those," said Zoe.

Working with NVivo

Because her work is project-based, Zoe often has limited time in which to complete her research, yet she needs to complete it with a high level of depth and rigor, which is where NVivo has been really helpful. "Without NVivo there are so many times where I just would have had too much data and nothing to help me deal with it!" she said.

Zoe's work has allowed her to uncover unexpected customer insights that have inspired overall strategy change within the organizations she's working with. "While working on an insurance project, the company held the misconception that none of their customers read the 'Product Disclosure Statement'; it was part of their internal story. My work was able to show that not only did more than half of them read it, but they spent time highlighting it, made notes on it and shared those details," said Zoe.

Using NVivo to analyze and store her data allows her to easily call on and defend her research when her findings are challenged by her clients. "Often I am reporting findings to staff quite high up the chain of command and it's great to actually be able to change their perception of the front line of their business. They might have worked in that position ten years ago, but a lot has changed since then, and I have the data to back it up," said Zoe.

Presenting results to clients

NVivo's visualizations have been key in the way that Zoe's results are received by her clients. "They're often quite blown away when the results are presented with NVivo. I can see the change in the business. It changes the way they see things, it changes how they talk, they start changing their goals," she said. Zoe's work allows her to be the voice of the customer, who might otherwise not be heard or understood by the organization.

Zoe's reporting stacks up against larger research and design organizations. In one instance, she met with a company that was also consulting at a firm she was contracted to.

Impressed by her reporting and visualizations, they enquired further about her research. Zoe recounts being surprised they weren't using the same tools as her: "I dug a little deeper and asked them what they used, and how they came to their conclusions in their research work. They told me 'We don't do anything like this, we go a little bit more on our intuition', which really disappointed me, because they're probably charging a lot more than me, and that's not research. Intuition is not satisfying, you need data!"

Frequently, at a project's conclusion, Zoe will present the results she has worked through in NVivo and find the heads of departments are moved to realign their goals. Through the latent opportunities uncovered, they realize they could be the first in the industry to fulfil a customer need, and that they need to act on it first. "They need to be prepared to take a leap forward. They become of the mind that if they can move on it quickly, they can really be ahead of the game," said Zoe.

There's a lot of power in truth, and at the end of the day, it's what has given the organizations that invest in a UX researcher like Zoe a commercial advantage. 

ABOUT THE AUTHOR

QSR International with Zoe Rosen

Zoe Rosen is a User Experience researcher with a background in cultural and behavioral anthropology studies. It's a discipline that she lives and breathes; she loves the research, the ethnographic style, the broadness, and the way anthropology is completely dedicated to how people behave and interact with everything around them. Zoe has wide experience in many industries, from banking and insurance to healthcare.

Research with a conscience

In its purest form, data manipulation is the process of changing data in an effort to make it easier to read or be more organized. For example, a log of data could be organized in alphabetical order, making individual entries easier to locate. But what happens when data manipulation is not handled ethically? Controversies around Cambridge Analytica, Facebook, international fraudsters, and identity thieves have made us aware of how technology allows for our data to be manipulated.
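In code, that benign sense of manipulation is nothing more exotic than reordering records. A minimal Python sketch of the alphabetical-log example, with invented entries:

    # A small log of entries, in arrival order (hypothetical data)
    log = [
        {"name": "Okafor", "action": "login"},
        {"name": "Adams", "action": "logout"},
        {"name": "Baker", "action": "login"},
    ]

    # "Manipulate" the data by sorting alphabetically by name,
    # making individual entries easier to locate
    for entry in sorted(log, key=lambda e: e["name"]):
        print(entry["name"], entry["action"])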

As researchers, it's vitally important that we're aware of what it means to acquire and handle data ethically, especially in the face of constantly evolving technology. In this piece, we'll look at how data can be manipulated, what it means to be ethical, and how data manipulation can raise questions for researchers when it comes to technology solutions.

Data manipulation in the wild

No stranger to controversy, social media giant Facebook has walked a fine line between protecting and exploiting the data it is custodian of. The organization came under fire when it was revealed that, in 2012, Facebook had conducted a study of over 689,000 users without their knowledge or consent.

The study involved manipulating users' feeds to remove either positive or negative sentiment posts over the course of approximately a week, to observe how the users then posted as a result. One test decreased users' exposure to their friends' 'positive emotional content', which resulted in fewer positive posts of their own. Another test reduced their exposure to 'negative emotional content' and the opposite happened.

The study concluded:

"Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."
 
The outrage from the academic community, however, was centered on the conduct of the study, rather than the issue of data privacy alone.

It was not merely an observational study, which could arguably be covered by users' consent in accepting the Facebook terms of service. This particular study involved an intervention (i.e., the manipulation of the news feed), which lacked the element of informed consent for the participants.

An intervention in itself is not necessarily unethical, as studies with interventions can be permitted on the grounds that the research aims could not be achieved any other way. However, a number of standards would have to be met in order for such research to pass any kind of ethics test.

In the case of Facebook’s study, these guidelines were not followed or met, and it could be reasonably argued that the study was therefore unethical. 

The potential for further misuse of this kind of data manipulation, beyond a study of its outcomes, is cause for concern. When the story broke in 2014, Clay Johnson, the founder of Blue State Digital, the firm that managed Obama's online campaign for the US Presidency in 2008, asked: "Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting 'upworthy' posts from two weeks beforehand? Should that be legal?"

These are certainly all relevant questions which have come further into our consciousness and political discourse, given the somewhat turbulent and divided global political climate.

Data manipulation and research

What does this mean for researchers in academic institutions at all levels, particularly those who are interested in utilizing technology to further their outcomes? Data is often 'manipulated' (in the truest sense) to make it more usable with technology solutions that help researchers delve deeper into their sources.

Researchers understand the importance of being ethical in all research, but when it comes to technology that is designed to make decisions on your behalf using algorithms and artificial intelligence, you could be forgiven for feeling like you're taking a leap of faith into unknown territory.

The key to remaining on the right side of ethical standards, and being able to utilize technology as it becomes available to you, is transparency and control.

For example, the automation of transcription has long been on the wish list of many qualitative and mixed methods researchers, who have either spent many long hours of their own time, or struggled to find research assistants to transcribe interview data on their behalf. Advances in artificial intelligence (AI) and natural language processing technology have now made this a reality, and human powered transcription is no longer a researcher’s only option.

One of the advantages of utilizing transcription powered by AI and natural language processing is the transparency of your final source. The transcription is verbatim, as opposed to an interpretation or summary of what was said from a human point of view. This means when it comes to the analysis of your data, you're analyzing a verbatim written version of your recorded audio source.

Transparency and control are key

Taking an ethical approach to your research work, whilst also being able to take advantage of the technology offered to you, is a matter of being able to maintain transparency and control over your sources.

In a digital climate that is plagued with data scandals, privacy issues and a distinct lack of transparency, it's imperative that the research community is not excluded from the use of new technologies, but that those technologies are developed in a way that maintains the high standards expected by researchers.

Learn more about research transparency today in this free whitepaper Transparency in an Age of Mass Digitization and Algorithmic Analysis.

Natural Language Processing

When Joaquin Phoenix's character falls in love with 'Samantha' in the 2013 film 'Her', he sets about creating a meaningful relationship with an operating system that is artificially intelligent and able to communicate with him in a language he can understand.

At the time the film was released, Apple’s Siri technology had been in the market, and in the hands of users for about two years, so the concept of speaking to a ‘smart’ device, and having it speak back to you wasn’t something entirely foreign to audiences. In the world Spike Jonze created in ‘Her’, this technology had evolved far enough that a human was able to develop a real emotional connection to it.

In reality, we’re not quite at the point where an exchange with your computer or smart device may lead you to romantic feelings, but it does make us consider where the technology is headed.

What is Natural Language Processing?

The technology that drives Siri, Alexa, the Google Assistant, Cortana, or any other 'virtual assistant' you might be used to speaking to is powered by artificial intelligence and natural language processing. It's natural language processing (NLP) that has allowed humans to turn communication with computers on its head. For decades, we've needed to communicate with computers in their own language; thanks to advances in artificial intelligence (AI) and NLP technology, we've taught computers to understand us.

In a technical sense, NLP is a form of artificial intelligence that helps machines "read" text by simulating the human ability to understand language. NLP techniques incorporate a variety of methods to enable a machine to understand what's being said or written in human communication—not just words individually—in a comprehensive way. This includes linguistics, semantics, statistics and machine learning to extract the meaning and decipher ambiguities in language.
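For a concrete taste of NLP at work, the sketch below uses VADER, an off-the-shelf lexicon and rule-based sentiment model shipped with the Python NLTK library. It is a generic illustration of a machine "reading" text, not the technology behind any particular assistant named above:

    import nltk
    nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
    from nltk.sentiment import SentimentIntensityAnalyzer

    analyzer = SentimentIntensityAnalyzer()
    scores = analyzer.polarity_scores("The course was brilliant, but feedback was slow.")
    print(scores)  # a compound score blending the positive and negative cues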

How is it used?

Frequently used in online customer service and technical support, chatbots help customers speak to ‘someone’ without the wait on the telephone, answering their questions and directing them to relevant resources and products, 24 hours a day, seven days a week.

In order to be effective, chatbots must be fast, smart and easy to use, especially in the realm of customer service, where the user's expectations are high and, if they're experiencing a technical issue, their patience may be low. To accomplish the expected level of service, chatbots are created using NLP to allow them to understand language, usually over text or voice-recognition interactions, where users communicate in their own words, as if they were speaking (or typing) to a real human being. Integration with semantic and other cognitive technologies that enable a deeper understanding of human language allows chatbots to get even better at understanding and replying to more complex and longer-form requests.
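Under the hood, the first step of such a chatbot is intent matching: mapping free-form text to a known category of request. Production systems use trained language models; this hypothetical keyword-overlap matcher in Python (all intents and replies invented) only illustrates the shape of that step:

    # Hypothetical intents, each with keywords that signal it
    INTENTS = {
        "reset_password": {"password", "reset", "locked"},
        "billing": {"invoice", "bill", "charge", "refund"},
    }
    RESPONSES = {
        "reset_password": "You can reset your password in account settings.",
        "billing": "I can help with billing. Which invoice is this about?",
        None: "Let me connect you with a human agent.",
    }

    def classify(message: str):
        words = set(message.lower().split())
        # Pick the intent whose keyword set overlaps the message the most
        best = max(INTENTS, key=lambda i: len(words & INTENTS[i]))
        return best if words & INTENTS[best] else None

    print(RESPONSES[classify("I'm locked out and need a password reset")])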

In a research context, we're now seeing NLP technology being used in automated transcription services such as NVivo Transcription. Transcription is one of the most time-intensive tasks for qualitative and mixed methods researchers, with many transcribing their interviews and focus group recordings themselves by hand. Unless you're an incredibly fast and accurate typist, this is a laborious task, taking researchers' time away from the actual analysis of their data.

Automated transcription tools utilize NLP technology to 'listen' to recordings of data such as focus groups and interviews, interpret them, and produce a format and language that is useful for the researcher to go on and analyze, either manually or using software.
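The article doesn't name a specific engine, but the step it describes can be sketched with an open-source speech-to-text model such as OpenAI's whisper package; this choice and the file name are assumptions for illustration (NVivo Transcription itself is a hosted service):

    # pip install openai-whisper  (also requires ffmpeg on the system)
    import whisper

    model = whisper.load_model("base")                      # small general-purpose model
    result = model.transcribe("focus_group_recording.mp3")  # hypothetical audio file
    print(result["text"])  # verbatim text, ready for coding and analysis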

Future uses of NLP

The NLP market size is estimated to grow to USD 16.07 Billion by 2021, globally, giving us a strong indication that NLP technology has huge growth opportunities across a number of sectors.

An understanding of human language can be especially powerful when applied to extract information and reveal meaning or sentiment in large amounts of text-based content (or unstructured information), especially the types of content that have typically been manually examined by people.

Analysis that accurately understands the subtleties of language, for example, the choice of words, or the tone used, can provide useful knowledge and insight. NLP will play an important part in the continued development of tools that assist with the classification and analysis of data, with accuracy only improving as technology evolves.

Academics at the University of Bologna have applied NLP to the most used part of any academic article: the bibliography. A group of researchers are developing tools that can extract information on citations using natural language processing and common ontologies (representations of concepts and their relationships) that can be openly accessed and connected to other sources of information. The idea of the project is to enrich the bibliography in order to give the reader more comprehensive information about each single entry, instead of looking at the bibliography as one large piece of information.
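The Bologna project uses trained NLP models and shared ontologies, but the kind of structure being recovered can be sketched with a deliberately simple pattern match over one of this article's own reference entries (a toy regex, not the project's actual method):

    import re

    entry = ("Boyatzis, R. E. (1998). Transforming qualitative information: "
             "Thematic analysis and code development. Thousand Oaks, CA: Sage.")

    # Recover author, year, and title fields from the free-text reference
    m = re.match(r"(?P<authors>.+?) \((?P<year>\d{4})\)\. (?P<title>.+?)\.", entry)
    if m:
        print(m.groupdict())
        # -> {'authors': 'Boyatzis, R. E.', 'year': '1998', 'title': '...'}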

In the commercial world, NLP analysis will have uses especially in the analysis of the typically carefully worded language of annual reports, call transcripts and other investor-sensitive communications, as well as legal and compliance documents. Effective analysis of sentiment in customer interactions will allow organizations to make improvements in their product and service delivery outcomes.

NLP will be essential to the future of research

More effective and accurate understanding between humans and machines will only strengthen the efficiencies and outputs of those who need to understand and analyze unstructured data.

No matter where it is applied, NLP will be essential in understanding the true voice of the research participant, the customer, or the user and facilitating more seamless interaction and interpretation on any platform where language and human communication are used.

To read more about automation, AI technology, and its effect on the research landscape, download this free whitepaper Transparency in an Age of Mass Digitization and Algorithmic Analysis.

Navigating Online Qualitative Research

Dr. Janet Salmons interviewed by Dr. Stacy Penna, Engagement and Enablement Director

NVivo Podcast Between the Data Ep 44

Launched by SAGE Publishing in 2009, MethodSpace is an online community for the discussion of social and behavioral research methods, which gives scholars and students a space to share experiences and solve problems on a global scale.

Dr. Janet Salmons is MethodSpace's Research Community Manager and a methodologist or, as she likes to call herself, "a researcher about research", applying academic discipline to the activity of conducting research. She is also the author of "Doing Qualitative Research Online" and an acknowledged leader in the field.

In the latest episode of the NVivo Podcast, “Between the Data”, QSR’s Engagement and Enablement Director, Dr. Stacy Penna learns Dr. Salmons’ best-practice advice for conducting research online both effectively and ethically.

>> Listen to the Full NVivo Podcast Here

Dr. Salmons has been charting the opportunities and challenges of conducting qualitative research online since the early 2000s. Across 20 years, the tools at the researcher's disposal have ballooned: audio, webinars, chats, whiteboards and more. And those are just the one-to-one methods; add solicitations for comment on platforms like Twitter and further new paradigms emerge: clearly, contributors will speak differently in public, within the constraints of 280 characters, than they will in a more private and nuanced in-depth interview.

Janet says it’s therefore crucial to be both consistent and mindful of the medium used: “Every part of the research design needs to be approached differently when you are conducting it online. You need an iterative, holistic approach to think it through. Because once you make one decision, it will have implications for the other design decisions.”

Similarly, the online environment presents new ethical dimensions. Says Dr. Salmons, "On some social media platforms, if you are collecting data there, you can't promise your IRB or ethics review board that you can protect that data - because someone else owns it." But that's only the start. Dr. Salmons' book devotes three chapters to research ethics, but the fundamental challenge of the online environment is that the ability to reach more subjects than ever creates a distance and impersonality that might allow standards to drop. "I ask people to think about the integrity of the role of the researcher," says Dr. Salmons, "because nobody's looking. What are your own values? And especially when the answers you're getting are not the answers you were hoping to get, are you going to be true to the data?"

This is particularly important as, socio-politically, we appear to be living in a time of public mistrust; Salmons feels that academics have a role to play as cheerleaders for rebuilding that trust on the firm foundation of discipline and facts.

NVivo's role in conducting research at scale

Dr. Salmons says, “Whether you are a student or an experienced researcher, part of the challenge is just keeping track of all of the materials.”

But tools like NVivo are more than just a useful repository. Given her concern for a consistent approach to all participants in research, across what may be multiple platforms and sources of data, NVivo's support for multiple media types means that qualitative researchers can both glean more insight and ensure rigorous review of the data. She gives the example of video interviews, where the background – from the books on a shelf to the organized or chaotic state of a room – can be hugely informative. These are the visual cues which get lost in a pure transcription.

"It's wonderful to be in the physical presence of other people," says Salmons, "but if you're doing an interview on a video-conferencing platform, you can record the visuals, diagrams, photographs and artifacts that you're discussing. You can see the response, a person's facial expressions and nonverbals, and it's all recorded in one place." In this sense, NVivo allows for dramatically more analysis of multiple types of data for the researcher.

Listen to the NVivo Podcast for more of Janet Salmons’ taxonomy of online qualitative research and how NVivo is helping researchers make the most of academics’ digital assets.

To learn more about Dr. Salmons’ advice for online qualitative research, watch this on demand webinar Connecting for Collecting Data: Qualitative Research Online with Human Participants.

Thematic Analysis Is More Popular Than You Think

The practice of thematic analysis is widely used in qualitative analysis but is sometimes not acknowledged, or is confused with other approaches. Here at QSR International we break down the ambiguities of thematic analysis as an approach, and hope this interpretation can breathe new life into it as new and emerging forms of content become more integral to this already established research tool.

What is thematic analysis?

Thematic analysis is not a methodology but a tool which can be used across different methods (Boyatzis 1998). It is used to find common themes in content such as:

This practice is dynamic. It can be done manually (by hand), in Excel or through a Computer Assisted Qualitative Data Analysis (CAQDAS) software tool. It traverses traditional qualitative research and quantitative data, allowing researchers to ask more questions of their content.

>> Watch Webinar to Learn how you can use NVivo for Thematic Analysis

When might you choose to do thematic analysis?

Put simply, you may be looking for the right way to explain or express patterns in your content. Consider this example: you are analysing representations of women on social media. You want to collect data from Facebook, Twitter and YouTube as rich datasets so you can access the online conversations and content about your research, organization or topic of interest, but also the valuable data behind the comments, like demographics and locations.

The challenge with importing, managing and analyzing different content types is this: how do you find the similarities or differences in the media before you? And what do you do with them then?

What are the benefits of sifting through content?

Thematic analysis helps you find connections in your content and understand the underlying themes to inform decisions. Use queries to ask complex questions and identify new meaning in your data. Test ideas, explore patterns and see connections between themes, topics, people and places in your project. Look for emerging themes, find words and discover concepts using text search and word frequency queries.
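
To make the idea of a text search concrete, here is a minimal sketch in plain Python of what such a query does conceptually – retrieving every passage that mentions a keyword. It is an illustration only, not NVivo’s implementation, and the sample documents are invented:

    # Minimal sketch of a text search: retrieve sentences mentioning a keyword.
    # The sample documents are hypothetical.
    documents = {
        "facebook_comments": "The campaign changed how women are shown. Representation matters.",
        "youtube_comments": "Representation in ads still lags behind.",
    }

    keyword = "representation"
    for name, text in documents.items():
        for sentence in text.split(". "):
            if keyword in sentence.lower():
                print(f"{name}: {sentence.strip()}")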

Thematic analysis can be used as a technique on its own, or as a first step in a variety of methodological approaches to analyzing qualitative data.

Once you have coded your content by theme, you can search for it based on how it’s coded using coding queries. Check for consistency and compare how different users have coded material using coding comparison queries. Cross-tabulate coded content and explore differences in opinions, experiences and behavior across different groups using matrix coding queries.
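
As a rough illustration of what a coding comparison measures, here is a toy Python sketch. It treats each coder’s work as a set of (document, paragraph) units tagged with the same code and computes a simple overlap score; it is an assumption-laden stand-in, not NVivo’s actual calculation:

    # Toy coding comparison: each coder's work is a set of (document, paragraph)
    # units tagged with the same code. All data here is hypothetical.
    coder_a = {("interview_01", 3), ("interview_01", 7), ("interview_02", 2)}
    coder_b = {("interview_01", 3), ("interview_02", 2), ("interview_02", 5)}

    agreed = coder_a & coder_b
    agreement = len(agreed) / len(coder_a | coder_b)  # simple overlap, not kappa
    print("Units coded by both:", sorted(agreed))
    print(f"Agreement: {agreement:.0%}")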

How do I visualize my data?

By visualizing your insights, you can explore even further. Get a sense of the larger trends happening and dive in deeper. Discover a new perspective. Identify new and interesting themes. Share results with others.

Visualizations can also provide an easy way to communicate findings with broader audiences.

Why should you do a thematic analysis?

Because it helps you easily understand how content plays a role in influencing decisions and behaviors.

How do I get started analyzing content and visualizing my insights?

Gain an advantage with NVivo – powerful software for qualitative data and content analysis that helps you make insight-driven decisions.

NVivo has a wide range of visualizations – word clouds, matrices and mind maps among them – that are particularly useful for thematic analysis.

Editor’s note: This blog was originally published in March 2017 and was updated in February 2022 for accuracy.

For more information about thematic analysis see these resources:

Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.

Braun, V. & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

Teaching thematic analysis

Thematic analysis of interview data: 6 ways NVivo can help

On any given day, close to 80% of NVivo users are busy analyzing interviews.

Like you, they're looking for the best way to make sense of their data so they can deliver robust results.

If you’re new to NVivo or just getting started with analyzing interviews (thematic analysis), it can be hard to know where to start. These tips can help guide you on your way.

>> Watch Webinar to Learn how you can use NVivo for Thematic Analysis

#1. Transcribe the interview recordings

Let’s face it, transcribing can be tricky – particularly if you’re working with a lot of interviews or have limited time (and budget).

Luckily, NVivo gives you a number of options. For example, you can import the interview recordings into NVivo and transcribe them yourself as you play them back, or have them transcribed automatically with NVivo Transcription.

>> Free Transcription: Try NVivo Transcription free today

#2. Group the responses to each question

If your interviews have already been transcribed, you can import the documents directly into NVivo and get started on analysis.

If you’re working with semi-structured interviews (where participants are asked the same set of questions), you can use heading styles to automatically organize the responses.

For example, you can gather all the responses to Question 1 in one place for easier analysis.

Refer to Automatic coding in document sources for step-by-step instructions.
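
If it helps to see the principle behind heading-based auto coding, the sketch below groups answers under their questions in plain Python. The transcript format (questions prefixed with "Q:") is an invented convention for illustration, not the heading-style mechanism NVivo itself uses:

    # Group responses under the question they answer. The "Q:" prefix is a
    # hypothetical stand-in for heading styles.
    from collections import defaultdict

    transcript = [
        "Q: What does collaboration mean to you?",
        "It means working across teams.",
        "Q: What barriers do you face?",
        "Mostly time and competing priorities.",
    ]

    responses = defaultdict(list)
    current = None
    for line in transcript:
        if line.startswith("Q:"):
            current = line[2:].strip()
        elif current:
            responses[current].append(line)

    for question, answers in responses.items():
        print(question, "->", answers)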

If your interviews are more free-ranging and conversational, you can use other tools to organize the content by theme.

#3. Find and catalogue themes to make sense of the data

Thematic analysis involves making sense of what your interview participants are saying.

NVivo gives you ways to get a broad feel for what themes are in the data and it also lets you drill down into the material for deeper analysis.

For example, you can run a quick Word Frequency query to see which words your participants are using most often. The resulting word cloud can give you early insight into emerging themes – and it’s a fun way to ease yourself into analysis.
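
For the curious, the heart of a word frequency count is simple enough to sketch in a few lines of Python. This is a conceptual illustration with made-up sample text and a tiny stop-word list, not what NVivo runs under the hood:

    # Count the most common words, ignoring a few stop words. Sample text invented.
    import re
    from collections import Counter

    text = "Water quality matters. Quality of water affects health, and health drives policy."
    stop_words = {"of", "and", "the", "a", "to"}

    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stop_words]
    for word, count in Counter(words).most_common(5):
        print(word, count)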

Taking a more thorough approach to thematic analysis, you can read through each interview and ‘code’ the emerging themes. This involves selecting interesting comments and putting them into containers called ‘codes’. At any time, you can open a code to see all the references you’ve gathered there.
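
A code really is just a named container of references back to your sources. As a loose mental model – with invented names and data, and no substitute for NVivo’s richer code properties – it might look like this:

    # A 'code' modeled as a named container of references to source material.
    codes = {}

    def code_reference(code_name, source, excerpt):
        """File an interesting excerpt under a theme code."""
        codes.setdefault(code_name, []).append({"source": source, "excerpt": excerpt})

    code_reference("water quality", "interview_01", "The river was cleaner a decade ago.")
    code_reference("policy", "interview_01", "Council funding dried up in 2015.")

    # 'Opening' a code is just retrieving everything gathered there:
    for ref in codes["water quality"]:
        print(ref["source"], "-", ref["excerpt"])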

NVivo offers plenty of ways to speed up the coding process – you can use the Automated Insights feature to find themes automatically.

Refer to About Automatic Coding Techniques for the details.

#4. See the connections between themes and move toward analytical insight

As you code your material by theme, you’ll start to develop a list of codes. At regular intervals, you can groom this list – checking whether related themes could be grouped together in a hierarchy.

This is not just ‘good housekeeping’ – it’s a vital step in the analysis process and helps you to see the connections between emerging themes.

You can open any code (by double-clicking on it) to view all the content gathered there but you can also run queries to retrieve your data in revealing ways.

For example, you could see where participants talked about ‘water quality’ in terms of ‘development’ – or where ‘policy’ came up in discussions about ‘water quality’.
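
Conceptually, that kind of coding query is an intersection: find the places coded at both themes. A minimal sketch of the idea, with hypothetical data:

    # Find units coded at both 'water quality' and 'development'.
    refs_by_code = {
        "water quality": {("interview_01", 4), ("interview_03", 2)},
        "development": {("interview_01", 4), ("interview_02", 6)},
    }

    overlap = refs_by_code["water quality"] & refs_by_code["development"]
    print("Discussed together in:", sorted(overlap))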

NVivo lets you query and visualize your data in all sorts of ways – refer to Move forward with queries and visualizations to find out more.

#5. Make comparisons between participants

If you want to compare what your interview participants say based on attributes like age, gender or location – then you can create a ‘case’ for each person and assign the demographic attributes.

This video gives you a quick overview of how cases work in NVivo.

Creating ‘cases’ for interview participants paves the way for powerful queries and visualizations. For example, you could create a matrix to see how men and women respond to a selection of themes.
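
Under the hood, a matrix like that is a cross-tabulation of case attributes against theme codes. Here is a bare-bones sketch of the idea in Python, with invented participants and themes:

    # Cross-tabulate theme counts by a case attribute (gender). Data invented.
    cases = {
        "P01": {"gender": "female", "themes": ["water quality", "policy"]},
        "P02": {"gender": "male", "themes": ["water quality"]},
        "P03": {"gender": "female", "themes": ["policy", "water quality"]},
    }

    matrix = {}
    for case in cases.values():
        row = matrix.setdefault(case["gender"], {})
        for theme in case["themes"]:
            row[theme] = row.get(theme, 0) + 1

    for gender, counts in matrix.items():
        print(gender, counts)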

#6. Stay organized and focused on your research design

In the thick of data analysis, it can be easy to lose sight of your research question.

Gathering your material into theme codes and organizing these codes in a ‘sensible’ hierarchy helps you to stay organized and focused.

Mind maps in NVivo are another great way to consolidate your thinking. For example, you could visualize your conceptual framework in a mind map and update it as your thinking evolves.

You should also consider creating a project journal in NVivo.

Keeping an audit trail of your challenges, assumptions, decisions and epiphanies will come in very handy when your supervisor (or client) asks a difficult question.

The journal tells the story of your project, makes your decisions transparent and helps you avoid that terrifying blank page when it comes to writing up.

Editor’s note: This blog was originally published in October 2016 and was updated in February 2022 for new information and accuracy.

Find out more

These 6 approaches to thematic analysis are just the tip of the iceberg and we’ll expand on them in upcoming posts.

To find out more – check the online help, watch video tutorials or join the NVivo Users Group on LinkedIn.

How about you? What are your tips for analyzing interviews? Tweet us at @NVivoSoftware using the #NVivotips hashtag.

ABOUT THE AUTHOR

Kath McNiff

Kath McNiff is on a mission to help researchers deliver robust, evidence-based results. If they’re drowning in a sea of data (or floods of tears) she wants to throw them an NVivo-shaped life raft. As an Online Community Manager at QSR, she knows that peers make the best teachers. So, through The NVivo Blog, Twitter and LinkedIn, she shares practical advice and connects researchers so they can help each other. When she’s not busy writing blog posts, swapping stories on social media or training the latest tribe of NVivo users, she can be found wrestling four feisty offspring for control of the remote.

6 tips to maximize your funding

Across all disciplines, there is growing pressure to obtain grants to advance research. There are many reasons why funding is necessary to further research, from conducting trials, analysis and testing, to acquiring equipment, creating prototypes or paying for researchers’ time and travel. It all costs money.

Let’s face it – applying for research funding is tough. It can take months to write a decent proposal, and with so much competition, the chances of getting funding are often slim.

As to exactly how funding recipients are selected, many governments and private organizations use standardized scoring systems, such as bibliometric and citation benchmarks, so that all researchers are measured alike. Some in the research community suggest that being published in a highly reputable journal holds more weight than the actual contribution or impact of the work on the research community. Others say funding favors older, more reputable researchers over younger, innovative research groups, such as PhD and Master’s students, and that this needs to change.

>> Try NVivo Free for 14 Days

Regardless, researchers at any stage of their career can significantly improve their chances of a successful submission by using the right tools to support both their submission and their research project.

#1 Keep all source information in one place

Grant writing is a dynamic process, constantly moving and changing as team members discuss the best approach, provide comments and offer input. And yet, creating a funding submission depends on many static objects and documents scattered across folders and subfolders – from Word documents to PDFs, spreadsheets, videos, social media comments and more. Then there are all the reference materials and citations, emails and funding agency sites to keep track of. Consequently, researchers often find themselves working with information all over the place. Being able to store and work with all of this information in one place can make a significant difference in the speed and efficiency of completing a proposal.

This allows you to collect and collapse documents into one view so you can explore themes and build a comprehensive, coherent structure. From there, you can create categories and a hierarchical structure, allowing your paragraphs to flow in a logical manner and providing easy access to relevant quotes.

#2 Focus on the quality of your work

Be clear on what is important about your research and how you’ll go about it. To ensure the quality of your work, use tools to categorize and classify data, including auto coding and data summary options like framework matrices. Coding brings together different themes and topics, or entities like places and people, into a single category. This makes it easier to search, compare and identify patterns in your data so that you can develop rigorous research that stands up to scrutiny.

#3 Create a living project

Begin with feedback from a prior proposal or multiple drafts of a proposal to create a history in your software platform. From there, you can track and manage your implementation of the project. Input any updates from the grant provider so the information is all in one place.

If your project is successful, you can import the award documents and reporting requirements you receive and track your funds right within the project. If you are rejected, it doesn’t mean your idea will never achieve funding. It might be that you need to make some changes, or that there was just not enough funding available in the round you were in. Don’t give up.

#4 Collaborate in real-time

In most cases, there is more than one person working on a grant application. For tracking purposes, the right software can show the dates of creation and modification, as well as the initials of the editor, in the landing pad. It’s very easy to see who has done what and when. It also allows you to collaborate in real time and see other researchers’ updates as you work.

#5 Network

One recent study suggested that it’s more important for researchers to build their collaboration network than to worry about which publications they produce or where they are cited. It suggests that well-known, highly reputable researchers with an established record of success are more likely to receive higher amounts of funding. If this is indeed true, then it pays to keep track of your grant-writing efforts and build respect within the academic and research community. Both your organization and your own career will have been built on the expertise that you and your research team have developed over years of hard work. Use tools to track your successes and your attempts, to help you and your organization succeed. This can also help demonstrate to the research community your determination, rigor and efficiency in applying for grants.

#6 Show your productivity

According to the same study, researchers’ past productivity also has a positive effect on funding. It revealed ‘a positive relationship between both quantity and quality of their papers on the amount of funding received.’ In other words, the more productive researchers are, the more likely they are to receive a higher amount of funding. This level of productivity can be captured and tracked via software to reflect different phases of your projects and how they have evolved.

Finally, get others to proof your work – even those outside of the research community. They may identify new angles you had not thought of.

Don’t be afraid to ask questions of the grant provider – there is always an opportunity to get in touch and ask whether they think your idea has enough legs to be worth applying. If you do end up applying, don’t forget to read the rule book. It may seem minor, but adhering to font size and spacing requirements is one of those little things that add up in the long run. Good luck!

Editor’s note: This post was first published in November 2019 and was updated in February 2022 for accuracy.

10 Tips for Writing a Literature Review

Conducting a thorough critique of the literature is incredibly important, but as a writer, you may feel daunted by the enormity of the task. Following these 10 tips can help you focus your writing efforts. These tips can also help you write a literature review that moves beyond summarizing the research and toward critiquing it well.

Tip 1. Understand what a literature review is.

A literature review is a well-reasoned, evidence-based, scholarly argument that demonstrates the need for your study. While your literature review will contain a great deal of information, it is not (primarily) an informative text. Keeping this in mind at the outset can lead you toward a critique that situates your study within the scholarly discourse relevant to your research topic.

Tip 2. Write a draft of your research problem statement.

A well-written literature review thoroughly analyzes and critiques the key concepts or quantitative variables central to your research topic. These key concepts or variables are generally expressed in a problem statement, so having a problem statement drafted can help you align your literature review to your research topic. For instance, rather than writing about “Burnout in Education,” your problem statement could lead you to focus your review on “Burnout in K-12 School Leaders.” This narrowed focus makes your literature review relevant and, importantly, doable.

Tip 3. Create an outline of your literature review.

Even though your outline is likely to change, create a document with headings that describe the pockets of literature you will review. In the above example about burnout in school leaders, you might have a heading called "Factors Influencing Burnout." You might already know that some factors to consider are lack of work/life balance, lack of resources, and dissatisfaction with pay and benefits. Create those subheadings. Then, follow the next tip.

Tip 4. Use your outline to guide your search.

The headings in your lit review outline can be used as keywords to search for relevant literature. Remember to document your search strategy and use synonyms. You might also locate a systematic review on your research topic, which is rich with references.

Tip 5. Organize your research articles.

We recommend using reference management software such as Citavi to organize your research articles. If that isn’t an option, create folders and save each research article under its in-text citation (e.g., an article by Parker et al., 2021 would be saved as “Parker et al. 2021”). Having one folder for all of your articles is the equivalent of piling your desk with stacks of articles that you can’t remember whether you have read. If you organize your research articles, you will be able to review all of the articles that relate to a specific topic in your literature review.
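
If you do end up managing files by hand, even a few lines of scripting can keep the folders tidy. The sketch below files citation-named PDFs into per-topic folders; the paths, the topic map and the file names (beyond the Parker et al. example above) are assumptions for illustration:

    # File citation-named PDFs into per-topic folders. Paths and topics hypothetical.
    from pathlib import Path
    import shutil

    library = Path("articles")
    topics = {
        "work_life_balance": ["Parker et al. 2021.pdf", "Nguyen 2019.pdf"],
        "pay_and_benefits": ["Ortiz 2020.pdf"],
    }

    for topic, filenames in topics.items():
        folder = library / topic
        folder.mkdir(parents=True, exist_ok=True)
        for name in filenames:
            source = library / name
            if source.exists():  # skip anything not yet downloaded
                shutil.move(str(source), str(folder / name))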

Tip 6. Use an annotation table to document relevant study information.

This step is critical to literature review success. You will search for trends in the literature. Therefore, you need to extract relevant information from articles and group this information together to analyze it. Writers often begin by sharing the results of one study, then the next, and so on, without offering up any synthesis of the literature. Synthesis is the result of analysis, and analysis needs to encompass articles that are grouped in some way. In the burnout example above, you may have extracted several findings that demonstrate that lack of work/life balance is a major factor in school leader burnout. You will want to state this finding clearly and review all of the articles about it together, so go ahead and group them in an annotation table at this stage.

Tip 7. Analyze your annotation table.

Once you have annotated several articles, analyze them for patterns, discrepancies, and gaps. A pattern could be a similar finding that you have noticed across several studies. It could also be a pattern of participants (e.g., the phenomenon has mostly been studied in female-identifying participants) or methodology (e.g., 10 of the 12 studies are quantitative). Often, a pattern points to a gap in the literature. NVivo can help you find the patterns and themes in your literature and keep you organized throughout the process of synthesizing it.
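
To see why a structured annotation table pays off, consider this small Python sketch: with one record per study, patterns of findings and methodology fall out of a simple count. The fields are illustrative, and the studies (beyond the Parker et al. example above) are invented:

    # One annotation record per study; then scan for patterns. Data invented.
    from collections import Counter

    annotations = [
        {"citation": "Parker et al. 2021", "method": "quantitative",
         "finding": "work/life balance drives burnout"},
        {"citation": "Nguyen 2019", "method": "qualitative",
         "finding": "work/life balance drives burnout"},
        {"citation": "Ortiz 2020", "method": "quantitative",
         "finding": "pay and benefits drive burnout"},
    ]

    print(Counter(row["finding"] for row in annotations))  # recurring findings
    print(Counter(row["method"] for row in annotations))   # methodological pattern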

Tip 8. Write clear and concise synthesis statements.

So, you have located a pattern, discrepancy, or gap in the literature – what next? Make sure that you state your finding clearly and concisely in the form of a synthesis statement. For instance, "Much of the research regarding school leader burnout focuses on the reasons why school leaders burn out" is a synthesis statement. Reporting that a single author "X" found something interesting is not.

Tip 9. Place your synthesis statements “front and center” in your writing.

As you report your findings, place your synthesis statements as the topic sentences (main ideas) of the paragraphs you write. Then present the evidence you pull from your studies to support that main idea. A hallmark of well-synthesized writing is that paragraphs weave information from several studies together around a central claim. Using the MEAL plan structure (Main Idea, Evidence, Analysis, Link) can help you craft paragraphs that are cohesive and analytical – hallmarks of good literature review writing.

Tip 10. Schedule time for revision.

When you are writing your literature review, you are wielding large amounts of information, and you are writing in complex ways that are likely new to you. As with all writing, expect that you will need to revise your work. Schedule time for it and, if necessary, ask for help with the areas you need to revise. Then dive into your revision systematically (e.g., do not revise for everything at once).

The above tips are important because they provide much-needed structure as you write your literature review. Often, writers set out with vague notions about what a literature review is, and the process begins to feel amorphous. These tips can help you break down the process of writing a literature review, bringing focus to the writing process. Return to this list again and again if you feel lost in “literature review land” – it will help you regain your footing and return to your writing with a renewed sense of clarity.

ABOUT THE AUTHOR

Dr. Desi Richter is an educational researcher and dissertation coach located in New Orleans, Louisiana. One of her greatest joys is working with researchers to help them develop their studies from inception to completion. She especially enjoys helping research scholars write about their research in ways that clearly communicate their research agenda and foreground their analysis. When Desi is not coaching doctoral students through their research, she conducts professional development for local teachers in her role at a local non-profit, AfterCLASS.

Her research interests include curricular narrowing, arts-based research, and social-emotional learning. She conducted an arts-based study using musical inquiry for her doctoral research and wrote her study findings as a cycle of thematically linked, narrative songs. Most evenings, you will find Desi writing songs, strolling along the bayou, or listening to live local music in the French Quarter.