Conducting a thorough critique of the literature is incredibly important, but as a writer, you may feel daunted by the enormity of the task. Following these 10 tips can help you focus your writing efforts. These tips can also help you write a literature review that moves beyond summarizing the research and toward critiquing it well.
Moving beyond a summary toward a genuine discussion of published work requires thoroughly tracking your sources, their key points, and the links between them. While this can sound tricky and time consuming when you're referencing hundreds of source materials ranging from journal articles and books to research papers and videos, reference management and writing tools like Citavi can help streamline this process (and more, as we'll cover in the upcoming tips).
A literature review is a well-reasoned, evidence-based, scholarly argument that demonstrates the need for your study. While your literature review will contain a great deal of information, it is not (primarily) an informative text. Keeping this in mind at the outset can lead you toward a critique that situates your study within the scholarly discourse relevant to your research topic.
Learn more in the on-demand webinar Conducting and Constructing a Literature Review for Maximum Impact.
A well-written literature review thoroughly analyzes and critiques the key concepts or quantitative variables central to your research topic. These key concepts or variables are generally expressed in a problem statement, so having a problem statement drafted can help you align your literature review to your research topic. For instance, rather than writing about “Burnout in Education,” your problem statement could lead you to focus your review on “Burnout in K-12 School Leaders.” This narrowed focus makes your literature review relevant and, importantly, doable.
Learn more about writing a compelling argument and developing your voice in the free on-demand trainings from the Research and Technical Writing Institute.
Even though your outline is likely to change, create a document with headings that describe the pockets of literature you will review. In the above example about burnout in school leaders, you might have a heading called "Factors Influencing Burnout." You might already know that some factors to consider are lack of work/life balance, lack of resources, and dissatisfaction with pay and benefits. Create those subheadings.
If you use a reference manager like Citavi, you can breeze through this step! With Citavi, you can save your sources directly in the program, create your literature review outline within the knowledge organizer, then export it to Word.
The headings in your lit review outline can be used as keywords to search for relevant literature. Remember to document your search strategy and use synonyms. You might also locate a systematic review on your research topic, which is rich with references. If you have Citavi, databases like Scopus and EBSCO integrate with the software – letting you easily search for sources. You can also use the Citavi Picker, which helps bring sources in from sites like Google Scholar by identifying ISBNs and DOIs on web pages and sending reference information to your Citavi project.
We recommend using reference management software such as Citavi to organize your research articles. This saves you tremendous time as Citavi helps you methodically manage quotes, sources, notes, and articles.
If that isn't an option, create folders and save each research article under its in-text citation (e.g., an article by Parker et al. 2021 would be saved as "Parker et al. 2021"). Having one folder for all of your articles is the equivalent of piling your desk with stacks of articles that you can't remember whether you have read or not. If you organize your research articles, you will be able to review all of the articles that relate to a specific topic in your literature review.
Learn more in this on-demand webinar Organizing Information in Your Field of Study.
This step is critical to literature review success. You will search for trends in the literature. Therefore, you need to extract relevant information from articles and group this information together to analyze it. Writers often begin by sharing the results of one study, then the next, and so on, without offering up any synthesis of the literature. Synthesis is the result of analysis, and analysis needs to encompass articles that are grouped in some way. In the burnout example above, you may have extracted several findings that demonstrate that lack of work/life balance is a major factor in school leader burnout. You will want to state this finding clearly and review all of the articles about it together, so go ahead and group them in an annotation table at this stage.
An alternative to the annotation table is Citavi's knowledge organizer. This feature lets you save notes, memos, and quotes from articles in the knowledge organizer while still linking to the original source. Even better, you can add categories to your notes, memos, and sources based on your keywords and themes.
Once you have annotated several articles, analyze them for patterns, discrepancies, and gaps. A pattern could be a similar finding that you have noticed across several studies. It could also be a pattern of participants (e.g., the phenomenon has mostly been studied in female-identifying participants) or methodology (e.g., 10 of the 12 studies are quantitative). Often, we can infer from a pattern to identify a gap in the literature. Using NVivo in your literature review can help you find the patterns and themes in your literature, piece together which researchers often write together, and keep you organized throughout the process of synthesizing literature.
Learn more in the on-demand webinar Accelerating your Literature Review with Citavi & NVivo 14.
So, you've located a pattern, discrepancy, or gap in the literature – what next? Make sure that you state your finding clearly and concisely in the form of a synthesis statement. For instance, "Much of the research regarding school leader burnout focuses on the reasons why school leaders burn out" is a synthesis statement. Reporting that a single author "X" found something interesting is not.
As you report your findings, place your synthesis statements as topic sentences (main ideas) of the paragraphs you write. Then present the evidence you pull from your studies to support that main idea. A hallmark of well-synthesized writing is that paragraphs weave information from several studies together around a central claim. Using the MEAL plan structure (Main Idea, Evidence, Analysis, Link) can help you craft paragraphs that are cohesive and analytical – hallmarks of good literature review writing.
Learn more in this on-demand webinar from the Research and Technical Writing Institute, Developing Your Voice: How to Paraphrase, Make Claims, and Synthesize Literature.
When you are writing your literature review, you are wielding large amounts of information, and you are likely writing in complex ways that are new to you. As with all writing, expect that you will need to revise your work. Schedule time for revision and, if necessary, ask for help identifying the areas that need it. Then dive into your revisions systematically (i.e., do not revise for everything at once).
The above tips are important because they provide much-needed structure for you as you write your literature review. Often, writers set out with vague notions about what a literature review is, and the process begins to feel amorphous. These tips, and reference management software like Citavi, can help you break down the process of writing a literature review, organize your notes and sources, automatically create citations, and bring focus to the writing process. Return to this list again and again if you feel lost in "literature review land." It will help you regain your footing and return to your writing with a renewed sense of clarity.
Dissertation by Design is an organization that supports and coaches all types of writers including students, academics, and research professionals.
See how you can streamline your literature review by requesting a free demo of Citavi today.
The practice of thematic analysis is widely used in qualitative analysis but is sometimes unacknowledged or confused with other approaches. Here at Lumivero, we break down the ambiguities of thematic analysis as an approach and hope this interpretation can breathe new life into this already established research tool as new and emerging forms of content become more integral to research.
NVivo offers a powerful solution for conducting thematic analysis due to its robust features and easy-to-use interface. The software is designed to enable researchers to quickly and accurately analyze large amounts of data and uncover underlying themes.
Thematic analysis is not a methodology but a tool that can be used across different methods (Boyatzis, 1998); it was first labeled as an approach in the 1970s (Merton, 1975). It is used to find common themes across many kinds of content.
This practice is dynamic. It can be done manually (by hand), in Excel, or with thematic analysis software – a Computer Assisted Qualitative Data Analysis (CAQDAS) tool. It spans traditional qualitative research and quantitative data, allowing researchers to ask more questions of their content and conduct thematic analysis on large data sets such as interviews.
In Methods: Teaching Thematic Analysis, Virginia Braun and Victoria Clarke describe viewing thematic analysis as "theoretically flexible because the search for, and examination of, patterning across language does not require adherence to any particular theory of language, or explanatory meaning framework for human beings, experiences or practices" – allowing for thematic analysis to be applied within a wide variety of theoretical frameworks.
Thematic analysis is especially versatile: it is helpful for those new to research as well as for experienced researchers, and it can be used across a range of research categories and theoretical perspectives.
In the Thematic Analysis Using NVivo 14 webinar, Ben Meehan, PhD, discussed Braun and Clarke's six steps of thematic analysis.
To learn more about using NVivo 14 for thematic analysis, watch our on-demand webinar Thematic Analysis Using NVivo.
Put simply, you may be looking for the right way to explain or express patterns in your content. Consider this example: you are analyzing representations of women on social media. You want to collect data from Facebook, Twitter and YouTube as rich datasets so you can access the online conversations and content about your research, organization or topic of interest, but also the valuable data behind the comments, like demographics and locations.
The challenge with importing, managing, and analyzing different content types is twofold: how do you find the similarities or differences in the media before you, and what do you do with them once you have?
To better understand when to use thematic analysis and for general best practices, check out the on-demand webinar by Braun and Clarke, Introduction to Thematic Analysis.
Thematic analysis helps you find connections in your content and understand the underlying themes, which in turn helps inform decisions.
Braun and Clarke encourage thematic analysis as the starting method to teach students new to qualitative research. “[Thematic analysis] is accessible, flexible, and involves analytic processes common to most forms of qualitative research. Students can progress from [thematic analysis] to grounded theory, IPA and discourse analysis, or progress from producing largely descriptive [thematic analysis] to producing rich and complex, conceptually informed [thematic analysis].”
Thematic analysis encourages researchers to use queries to ask complex questions and identify new meaning in their data. Test ideas, explore patterns, and see connections between themes, topics, people, and places in your project. Look for emerging themes, find words, and discover concepts using text search and word frequency queries.
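To see what a word frequency query is doing conceptually, here is a minimal sketch in Python. NVivo provides this as a built-in query, so this is purely illustrative; the transcript filename and the stop-word list are assumptions.

```python
# Conceptual sketch of a word frequency query, the kind of analysis
# NVivo runs for you. "interview_transcript.txt" is a hypothetical file.
import re
from collections import Counter

text = open("interview_transcript.txt", encoding="utf-8").read().lower()
words = re.findall(r"[a-z']+", text)

# a tiny illustrative stop-word list; real tools use much longer ones
stopwords = {"the", "a", "an", "and", "of", "to", "in", "it", "is", "that", "i"}
freq = Counter(w for w in words if w not in stopwords)

# frequent terms can point to candidate themes worth coding
for word, count in freq.most_common(15):
    print(f"{word:<15} {count}")
```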
Thematic analysis can be used as a technique on its own or it can be used as a first step in a variety of methodological approaches to analyzing qualitative data including:
Once you have coded your content, you can search for it based on how it's coded using coding queries. Check for consistency and compare how different users have coded material using coding comparison queries. Cross-tabulate coded content and explore differences in opinions, experiences, and behavior across different groups using matrix coding queries.
By visualizing your insights, you can explore even further. Get a sense of the larger trends happening and dive in deeper. Discover a new perspective. Identify new and interesting themes. Share results with others.
Visualizations can also provide an easy way to communicate findings with broader audiences.
Easily understand how content plays a role in influencing decisions or behaviors.
Gain an advantage with NVivo – powerful software for qualitative data and content analysis that helps you make insight-driven decisions.
NVivo has a wide range of visualizations. Below are a few which are particularly useful to thematic analysis:
NVivo 14 provides additional advantages for thematic analysis for interview and document analysis with these powerful features:
Editor's note: This blog was originally published in March 2017, and was updated in February 2022 and October 2023 for accuracy.
For more information about thematic analysis see these resources:
Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.
Braun, V. & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Merton, R.K. (1975). Thematic analysis in science: Notes on Holton's concept. Science, 188(4186), 335–338.
Save valuable time in your data analysis with three new features and three feature improvements in part two of the latest XLSTAT release!
Now you can view all the automatically generated graph options before making your choice and incorporate tornado diagrams and back-to-back histograms to help you easily improve your analysis of groups. Plus, this release includes an update to the PROCESS feature that lets you use covariates and new models and see conditional effects for deeper marketing insights.
Dig deeper into XLSTAT’s new features and improvements with the highlights below.
New Feature Available on Windows in Beta, Basic +, All Applied, and Premium
Automatically build the right graphs for your data! With the automatic data visualization feature, you don’t have to worry about which graph to use as it will propose several graphs according to your data. Simply select your quantitative and qualitative variables in the “Dataviz” dialog box and let the feature guide you. Once you’ve chosen your graph, click on “export”.
New Feature Available in All Solutions
Are your groups similar for all measurements, or perhaps only one or two are different? Make a tornado graph to find out! With the new tornado diagram feature, you can quickly compare the measurements of two groups and obtain a beautiful, informative tornado chart.
New Feature Available in All Solutions
Need to figure out if your groups have the same distributions? Compare the distributions of two groups back-to-back with this new feature located in Tornado charts.
Improved Feature Available in Marketing and Premium
Determine the conditional effects and use more sophisticated models with covariates to analyze mediation and moderation with the improved PROCESS feature. Plus, a new graph for visualizing the conditional effect of a variable on a mediator is now available!
Improved Feature Available in Marketing and Premium
More accurately determine the right sample size required for research and ensure that you achieve the desired level of precision with the improved sample size calculation feature. To do this, simply choose the standard deviation you want from the sample size dialog box.
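For context, a common formula behind this kind of calculation – the sample size needed to estimate a mean within a given margin of error – is n = (z·σ/E)². Here is a minimal sketch with illustrative numbers; the standard deviation and margin below are assumptions, not XLSTAT defaults.

```python
# Hedged sketch of the standard sample-size formula for estimating a
# mean: n = (z * sigma / margin)^2. All input values are illustrative.
import math

z = 1.96        # z-score for 95% confidence
sigma = 12.0    # assumed standard deviation of the measurement
margin = 2.0    # desired margin of error

n = math.ceil((z * sigma / margin) ** 2)
print(f"required sample size: {n}")  # -> 139
```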
Improved Feature Available in Forecasting and Premium
Quickly compare your models with XGBOOST in the Easy Fit feature which lets you run and compare several models. Plus, if you’re familiar with Excel and XLSTAT formulas, you can run XGBOOST directly in an Excel cell!
How to Install the Update?
This new version will give you access to all the new features mentioned above. The installation of our new version is recommended for all users.
If you have a valid XLSTAT license with access to maintenance and upgrades, you can download the new version for free.
If you are currently using our trial version, you can purchase a license of XLSTAT to access these new features.
Never tried XLSTAT before? Download your free trial today!
Project management is a multifaceted endeavor that involves careful planning, resource allocation, risk assessment, and timely execution. Despite thorough planning, unexpected issues can arise that derail project timelines and devastate budgets. But with Monte Carlo simulation, a powerful technique that models a wide range of possible scenarios and the probability each will occur, project managers can anticipate and address potential roadblocks.
Monte Carlo simulation is a computerized, mathematical technique that runs thousands of simulations with random variables, pulled from either historical data or expert opinion, to determine the range of outcomes for a scenario. This technique can be leveraged in project management for planning project tasks, completion times, task durations, point estimating, and best- and worst-case scenarios in industries such as transportation planning, defense, aerospace engineering, and construction.
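As a concrete illustration of the technique, here is a minimal Python sketch of a Monte Carlo model for a three-task schedule. The three-point duration estimates are made up, and tools like @RISK run this kind of sampling at far larger scale inside Excel.

```python
# Hypothetical Monte Carlo sketch: total project duration from
# per-task (optimistic, most likely, pessimistic) estimates in days.
import random

tasks = {
    "design":  (5, 8, 15),
    "build":   (10, 14, 25),
    "testing": (4, 6, 12),
}

N = 10_000
totals = sorted(
    sum(random.triangular(low, high, mode)   # one sampled duration per task
        for low, mode, high in tasks.values())
    for _ in range(N)
)

print(f"mean completion time: {sum(totals) / N:.1f} days")
print(f"80% chance of finishing within: {totals[int(0.8 * N)]:.1f} days")
```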
In this article, we'll delve into ten common project management issues that can be effectively anticipated and mitigated using Monte Carlo simulation.
Accurate project estimation is a fundamental challenge. Monte Carlo simulation can help project managers by incorporating uncertainty into estimations, generating a range of possible outcomes, and providing a probability distribution of project completion dates and costs.
Resource availability can fluctuate during a project's lifecycle. By simulating various resource allocation scenarios, project managers can identify potential bottlenecks, plan for contingencies, and allocate resources optimally.
Project tasks are often interdependent. Monte Carlo simulation can model the relationships between tasks, assessing how delays in one task might affect others. This helps in understanding critical paths and potential delays.
Scope creep can cause projects to veer off track. By simulating the impact of scope changes, project managers can determine how modifications might influence project timelines, budgets, and overall goals.
The time it takes to complete a task can be uncertain. Monte Carlo simulation can simulate various durations based on historical data – providing a clearer understanding of possible project durations and helping in setting more realistic deadlines.
External factors like weather, market conditions, or regulatory changes can impact a project. By incorporating these variables into simulations, project managers can evaluate their potential effects and devise contingency plans.
Risks are inherent in any project. Monte Carlo simulation enables the quantification of risks by assigning probabilities to various scenarios – allowing project managers to proactively address potential issues before they escalate.
Financial constraints can challenge project execution. Monte Carlo simulation can help project managers estimate budget variations – aiding in the allocation of financial resources and identification of budget buffers.
Effective communication is vital for project success. Using simulations, project managers can visually represent complex scenarios to stakeholders – enhancing communication and facilitating better decision-making.
In situations where multiple options or strategies are available, Monte Carlo simulation can aid in making informed decisions by quantifying the potential outcomes of each option, thus supporting strategic choices.
Monte Carlo simulation is a valuable tool that equips project managers with the ability to foresee and manage various challenges that can arise during a project's lifecycle. By incorporating uncertainty and variability into the planning process, project managers can make more informed decisions, set realistic expectations, and devise effective strategies for tackling unexpected issues.
As the realm of project management continues to evolve, embracing techniques like Monte Carlo simulation can significantly enhance the likelihood of project success in an increasingly complex business landscape.
While it’s clear that applying Monte Carlo simulation to your project schedules can help you plan contingencies and meet deadlines, the next step is implementation – and it’s easier than you might think!
Designed with project managers in mind, Lumivero’s ScheduleRiskAnalysis (included in DecisionTools Suite) analyzes schedule risk using Monte Carlo simulation with @RISK – all within Microsoft Excel. ScheduleRiskAnalysis lets you apply risk modeling on project files created in Microsoft Project and Primavera P6 while preserving the integrity of your original project model. This streamlined setup makes it easy to start applying Monte Carlo simulation to your project schedules right away.
Additionally, ScheduleRiskAnalysis features probabilistic Gantt charts that clearly display the likelihood of task durations and finish dates, plus critical indices for identifying the most important variables in your schedule – helping you incorporate robust risk analysis into your project schedules like never before.
With ScheduleRiskAnalysis, you can quickly identify new opportunities, avoid unseen schedule delays, understand critical factors, and clearly communicate risks to team members — solving project manager issues before they arise. Learn more about how Monte Carlo simulation can help improve your project management by requesting a free demo of ScheduleRiskAnalysis, included in DecisionTools Suite.
Learn about Dr. Bhattacharya's qualitative research in the field of decolonization and gain insights into balancing research, mentoring, and supervision.
In the realm of research, it's a well-known fact that qualitative research holds a distinct position as it is complex, theoretical, and abstract. As we delve deeper into this field, we realize the depth and the labyrinthine structure it offers. Our latest podcast episode provides you with a journey through this maze, guided by Dr. Kakali Bhattacharya, Professor at the School of Human Development and Organizational Studies and Education at the University of Florida.
Dr. Bhattacharya's work in the field of decolonization in qualitative research is not just impressive but also groundbreaking. Her passion was sparked by Ruth Behar's book, Translated Woman, which opened her eyes to a different dimension of qualitative research.
“She (Behar) was talking about how this process of translating somebody's stories translates the storyteller as well, and the stories that she was sharing were so powerful and also so reflective of the social conditions that affect certain groups of people in a certain kind of way that I was moved. I was in tears, and I didn't understand that that was research,” explained Dr. Bhattacharya.
As Dr. Bhattacharya started work on her dissertation, she was greatly influenced by Linda Tuhiwai-Smith's work on decolonizing methodologies and conversations with her colleague, Violet Johnson. After being encouraged to decolonize her mind and methodology, Dr. Bhattacharya dove into reading post-colonial scholars’ work. This research led Dr. Bhattacharya to the term “D/colonizing” to explain the complex movement of transnational diasporic groups between the present and utopic dreaming.
“I felt like decolonizing work is for a transnational diasporic group of people is always a shuttling. It's a shuttling between where we are now to where we might want to imagine, without being in any relationship with any colonizing structures, discourses, materiality, so that utopian dreaming as a decolonizing strategy and then current negotiation was how I was situating that idea. And so I slash the word and call it D-slash-colonizing,” said Dr. Bhattacharya.
Dr. Bhattacharya goes on to describe the complexities of the phrase D/colonizing.
“I've kind of looked at like what's in the slash, what's in this in between places, what's in the movement back and forth, because there is no more fluid and no more pure colonizing spaces and no more pure decolonizing spaces. So, we're always moving through multiple types of consciousness of our own colonization and our own resistance to it. So, I wanted to make it messier than like a clean, pure thing. So that's how I got into the D-slash-colonization,” said Dr. Bhattacharya.
As we delve further into the conversation, Dr. Bhattacharya shares the challenges and goals of qualitative research.
“Linda Tuhiwai-Smith first alerted me that research is an exploitative enterprise. It's a colonizing enterprise. It has history of doing some very bad things. You know, mostly from Western researchers, and in qualitative research particularly, there are some new movements that have started that I haven't been able to align myself with,” said Dr. Bhattacharya.
To ensure she conducts her research in an ethical way, Dr. Bhattacharya developed a guide for her research relationships.
“I really needed to have something that I could use, and it could also illuminate a path of ethics, of relationality with the people that I work with as participants or co-researchers,” said Dr. Bhattacharya. “I have argued that you posture to give up your will to know. … You're not owed anything. You enter the space with humility to learn what is being offered to you by folks, but you're not owed anything."
Continuing the topic of a guide for herself and others, Dr. Bhattacharya gives us a glimpse of her unique mentoring approach, which encourages students to center their identity while fearlessly breaking the traditional rules of dissertations. She has penned a book, Fundamentals of Qualitative Research: A Practical Guide (2017), that serves as a profound resource for readers interested in applying qualitative research in their work.
“One of the things that I want to do when I mentor graduate students is to make sure that ... their voice is not silenced but, you know, sharpened, amplified and brought to bear on their work instead of telling them that they need to rigidize themselves into a certain academic voice,” said Dr. Bhattacharya.
Regarding her book, Dr. Bhattacharya explained that she wanted to give people who knew nothing about qualitative research a practical guide with exercises so that, by the time they finished the book, they would have a decent understanding of how to apply qualitative research in their own work.
The final segment of our discussion focuses on Dr. Bhattacharya's personal journey juggling research, teaching, and supervising. She discusses actively working to improve her balance more recently due to an autoimmune disease and learning to tune in to her body and energy.
Dr. Bhattacharya describes giving all of her time to writing, researching, and helping students – occasionally to her detriment. Now, she has designed her classes in an efficient way that allows her to assist students while not burning out.
Throughout the episode, Dr. Bhattacharya stresses the importance of unlearning and relearning. She shares that her research has evolved over time and has taken a deeper dive into D/colonizing. She emphasizes the importance of centering one's identity, unlearning traditional research rules, and fostering relationships with research participants.
“First, figure out why are you drawn to this work. … Not to just say that because I want to save people, but really figure out like when was the first time you became interested in this? Why are you interested in this? And try to figure that out. Not for academic purposes, but for the purpose of your being. ... Don't chase currency, chase your purpose,” concluded Dr. Bhattacharya.
The beauty of this episode lies not just in the insights shared but also in the candid experiences and practical tips that Dr. Bhattacharya offers. She shares her journey, her struggles, and her triumphs – making it a truly enlightening exploration. Whether you are a student, a researcher, or a professional, this episode promises a deep understanding of qualitative research and its intricate nuances.
Embark on this journey with us and let Dr. Bhattacharya guide you through the intricate maze that is qualitative research in Episode 56: Chase Your Purpose, Not Currency.
Effortlessly improve your decision making through risk modeling and analysis with @RISK software – a powerful add-in tool for Microsoft Excel. By using the Monte Carlo simulation technique, @RISK computes and tracks a vast range of possible scenarios in your risk model, and the probability each will occur, to help you judge which risks to take and which to avoid.
@RISK continues to improve to meet your needs in this uncertain world – now featuring an export option for Gantt Charts, upgraded Time Series modeling, and improved markers display for clear data interpretation.
With this update, you can now:
Learn more about the new release below and update your @RISK software today to benefit from the new features.
Available for ScheduleRiskAnalysis with DecisionTools Suite
Clarify communication within your team to help make data-driven decisions with the new export option for Gantt Charts in ScheduleRiskAnalysis. You can now export Gantt charts to PDF files for any custom time range and data scale, preview the report, and create it in seconds – increasing your ability to communicate project schedules and timelines to team members, stakeholders, and clients.
Adapt your analysis to your data and its characteristics with the improved Time Series feature. With this new update, you can choose to synchronize the time series models you need to fit with the i-th value option.
Enhance visibility and ease of data interpretation by effortlessly highlighting key information from your charts. Markers are now automatically displayed horizontally on cumulative distribution function charts to facilitate quick insights and informed decisions.
This new version will give you access to all the new features mentioned above. The installation of our new version is recommended for all users.
If you have a valid @RISK license with access to maintenance and upgrades, you can download the new version for free by opening @RISK and checking for updates from the menu.
If you are currently using our trial version, you can purchase a license of @RISK to access these new features.
Never tried @RISK before? Download your free trial today!
As 2023 progresses, manufacturing companies continue to focus on supply chain management issues. In an April 2023 survey conducted by CNBC, only 36% of supply chain managers said they expected inventories to return to normal by year’s end. The New York Federal Reserve’s Global Supply Chain Pressure Index, which collates data from a range of indicators including air freight and shipping costs, began rising again in July after dipping to a historic low in May. With the price of materials and storage still fluctuating, optimizing production processes through reliability engineering can help manufacturers reduce cost pressures elsewhere.
Reliability engineering is a sub-discipline of systems engineering that lets manufacturers fine-tune the dependability of a production plant, process, or finished product, showing how failures or waste in production can affect costs or profits. Predictive analytics solutions that use Monte Carlo simulation can help manufacturers develop reliability engineering models that allow them to effectively manage risk.
Because Monte Carlo simulation can account for a wide range of variable factors, including random chance, it can be tailored to each manufacturer’s circumstances and needs, such as improving supply chain management through optimization of raw material usage or production capacity. This article looks at two examples of manufacturing companies that have made use of Lumivero’s @RISK and DecisionTools Suite to improve their reliability engineering and risk analysis with Monte Carlo simulation.
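As a toy illustration of reliability modeling with Monte Carlo simulation (not either company's actual model), the Python sketch below estimates monthly production lost to random equipment failures. The failure rate, repair times, and throughput figures are all assumed values.

```python
# Simplified reliability sketch: lost output from random failures.
# Failures arrive as a Poisson process (exponential gaps between them);
# each failure incurs a lognormal repair time. All numbers are assumptions.
import random

HOURS_PER_MONTH = 720
THROUGHPUT = 1.5    # tonnes refined per hour (assumption)
MTBF = 200.0        # mean time between failures, in hours (assumption)

N = 10_000
lost = []
for _ in range(N):
    t, downtime = 0.0, 0.0
    while True:
        t += random.expovariate(1.0 / MTBF)   # time to next failure
        if t >= HOURS_PER_MONTH:
            break
        # repair time: lognormal with median ~8 hours (assumption)
        downtime += random.lognormvariate(2.08, 0.5)
    lost.append(downtime * THROUGHPUT)

lost.sort()
print(f"mean monthly production loss: {sum(lost) / N:.0f} tonnes")
print(f"95th percentile loss:         {lost[int(0.95 * N)]:.0f} tonnes")
```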
Trial runs for new manufacturing processes are a necessary step in research and development — a necessary step that can also be very costly. For Met-Mex Peñoles, cost is even more of a concern. The company is one of the world’s largest silver and zinc refiners and leads Latin America in the refining of gold and lead. Running refinement process trial runs with real precious metals could become very expensive very quickly.
Met-Mex Peñoles uses the Six Sigma Design of Experiments (DoE) framework to develop its trial runs. The DoE framework allows teams to investigate how processes can vary based on:
According to Met-Mex Peñoles technology manager Ignatio Quijas, Lumivero’s DecisionTools Suite has played a major role in simulating their Six Sigma DoE trial runs — particularly the Monte Carlo simulation tool @RISK. “Using @RISK to simulate changes in process design allows us to answer some difficult questions without actually running trials,” Quijas told Lumivero.
Using @RISK, his team can input historical data about everything from physical measurements and processing errors to manufacturing equipment tolerances and cost analyses. Plus, the TopRank tool and @RISK's distribution-fitting feature make it possible for Quijas to generate graphic representations of simulated trial outcomes that he can use to communicate findings to management teams. The result is less wasted precious metal in trials — and reduced costs for improving their processes.
Met-Mex Peñoles was able to develop its reliability engineering simulations based on historical data. However, what happens when manufacturers don’t have data to use as inputs for their predictive analytics tools? Hungarian risk analysis firm SzigmaSzervíz Ltd. developed a risk-quantification method to deal with just this situation. Their IntegRisk method integrates a scenario analysis process that helps generate input data to inform Monte Carlo simulation models.
Tasked with helping a client understand the potential profit impact of a new plant maintenance process, SzigmaSzervíz asked the project managers to prepare a schedule of activities that would be carried out during the maintenance work. These activities were modeled in Microsoft Project. Assuming no delays, the scheduled maintenance work would take 26 days. Then, managers and team leaders began the scenario analysis. They conducted a series of workshops in which they evaluated each maintenance activity for risk impacts — that is, for any event that could affect the projected duration of the maintenance period — based on their previous collective experience with other maintenance shutdowns.
Using data generated by the company’s scenario analyses, SzigmaSzervíz then ran Monte Carlo simulations with @RISK to generate a lognormal probability distribution. This probability model helped identify both the most likely and the most potentially damaging impact events — for example, a lack of engineering capacity to carry out electrical circuit maintenance activities that could extend maintenance completion time by as much as five days. Then, SzigmaSzervíz developed a tornado graph to show potential delays caused by the most likely impact events and determine what the overall delay to the maintenance timeline could be. The model showed a mean delay of 4.258 days to the original 26-day schedule.
Based on this outcome, SzigmaSzervíz was then able to develop another model that analyzed the potential profit loss that could result from a maintenance shutdown that lasted 30.258 days instead of 26 days. Their findings showed that this delay could cost the manufacturer as much as €5 million. The manufacturer was able to develop risk treatment plans that kept maintenance time close to projected deadlines, ultimately reducing profit loss.
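For a sense of how such a model works, here is a heavily simplified sketch: sampling a lognormal delay on top of the 26-day baseline schedule. The lognormal parameters are assumptions chosen only so the mean delay lands near the 4.258 days reported above; SzigmaSzervíz's actual inputs came from the workshop data.

```python
# Toy version of a lognormal schedule-delay model; parameters are assumed.
import random

BASELINE_DAYS = 26
MU, SIGMA = 1.30, 0.55   # gives a mean delay of about 4.27 days

N = 100_000
delays = [random.lognormvariate(MU, SIGMA) for _ in range(N)]
mean_delay = sum(delays) / N

print(f"mean delay:           {mean_delay:.3f} days")
print(f"mean completion time: {BASELINE_DAYS + mean_delay:.3f} days")
```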
Improve Your Reliability Engineering with Monte Carlo Simulation
Probabilistic analysis tools like DecisionTools Suite can provide manufacturers with a risk analysis process that can improve reliability engineering for production and maintenance processes. Our free example models for different scenarios can help you understand more about how Monte Carlo simulation can help your organization. Download the models today:
If you're wondering 'what is a literature review' or trying to figure out how to write a literature review, you've come to the right place. While a literature review can be a summary of sources, it can also discuss published information in a variety of formats on a specific subject area and tends to have an organizational pattern that combines both a summary (a recap of the information) and a synthesis (a re-organization of the information).
The literature review for your article, thesis, or dissertation requires keeping track of sources, their important points, and their links to each other – for hundreds of journal articles, books, research papers, videos, scholarly articles, and other references. So, it’s no surprise grad students and researchers frequently struggle with how to write a literature review.
Many university guides on the subject recommend creating a synthesis matrix for keeping track of sources, ideas, and quotations. Traditionally, this matrix was often created as a Word document, and you’ll still find many templates available online. However, more and more academics now seem to be using spreadsheets instead.
This blog post will look into the advantages and disadvantages of using Excel and Word, explore the reasons why researchers use spreadsheets, and discuss the benefits of using a specialized writing and reference management program like Citavi.
Proponents of the Excel approach are quick to tout the many benefits. First, there’s no need to pay for a new piece of software, since if you already have Microsoft Office installed on your computer, you also already have Excel. Otherwise, you can also use Google Sheets which has all the options you might need.
Then, there's the simplicity and flexibility of using a spreadsheet. Setup time is pretty low. You simply create a few columns and can get started using your literature tracking system in a matter of minutes.
Another benefit is how easily customizable the solution is – you can make the categories exactly what you want. Need a column to track the location of a study or a specific intervention? You just need to add it. Even though Excel can get complicated if you set up formulas or other customizations, for a literature review spreadsheet you can usually just use it as a simple table.
So far, the advantages listed apply to Word as well, but Excel and Citavi have one crucial advantage over Word: they let you search, sort, and filter. Have a vague recollection of a note you wrote but only remember one term you used in it? Use Excel's "Find" feature. Want to sort all your notes by year of publication of your source? Nothing could be easier than sorting your "year" column in ascending order. Want to find clinical trials with female participants with a statistically significant intervention? If you set up your Excel sheet as described below under "Version 2," such combinations of queries are possible, and in Citavi, setup is even easier as it lets you save sources directly into the program and organize your literature review outline in the knowledge organizer.
Citavi interface showing outline, sources, reference metadata, and an article PDF.
So, with all these advantages, how does the Excel method work in practice?
When you search for “Excel literature review”, Dr. Elaine Gregersen’s 2016 blog post “How I use Excel to manage my Literature Review” about her personal literature tracking system is one of the first results to pop up. It’s an approach that’s still often praised in discussion threads about Excel literature tracking methods. In her own words, it’s a simple approach, but that’s what makes it work. Her approach uses a literature review spreadsheet in addition to a reference manager. She uses one sheet only and includes columns for basic citation information, keywords, objectives, methods, and conclusions. In addition, she adds in four personalized categories: happy thoughts, unhappy thoughts, her own ethical concerns, and the author’s ethical concerns. These last two columns perfectly align with her field of Autoethnography. The happy thoughts column is for notes, such as how findings relate to her own work, while the unhappy thoughts column is for times when she disagrees with an author, among other uses.
Dr. Raul Pacheco uses a similar one-sheet method, which he calls the Conceptual Synthesis Excel Dump (CSED) technique since he tosses in any literature he might be using for analysis. His setup overlaps in some ways with Gregersen’s but has a few differences; he has columns for the concept (i.e. theme), citation, main idea, three columns for notes (which function similarly to Gregersen’s happy and unhappy thoughts), cross-references, quotes, and page numbers.
A useful tip is to create a dedicated column for quotations to help separate the authors' exact words from your analysis of them or of the article as a whole. This can help you avoid inadvertently misrepresenting an author's ideas as your own when you're later writing your literature review.
Taking the models laid out by Gregersen and Pacheco as a jumping-off point, it's easy to make some tweaks for even better usability in your own projects. Obviously, you'll want to create columns that fit your needs. Instead of a "main theme" column, you might have several "key takeaways" columns, or a highly personal column for how each article relates to your own work. You might include only the author names and year of publication for an article rather than the full citation (in which case we'd highly recommend saving the full details in a reference management program!). Some people might want to copy the abstract the authors provide, while others will choose to write their own summaries. You can add "notes" columns or distinguish between paraphrases, comments, and direct quotations. Beyond that, there are a lot of other small things you can do to make your spreadsheet work better for you, such as linking from a citation to the actual PDF, adding comments to cells, or adding drop-down lists to make data entry easier.
If you struggle with organizing your notes and memos, you could benefit from a reference management software like Citavi. Citavi lets you make notes within the program and easily connects your notes, memos, and quotes to your sources – helping you keep track of all your thoughts and research.
In Citavi, see all your notes and comments about a source in one place.
If you want to take your basic Excel spreadsheet up a notch, you can do so in several ways. For one, you can make use of multiple sheets in the same workbook. Dr. Kathleen Clarke describes her method which involves a major spreadsheet for tracking all the high-level information about a source along with minor spreadsheets which are more granular. She describes her method as a mix between Gregersen’s and Pacheco’s, but she also includes additional sheets on different but related topics and for studies she wants to read later on. One other notable addition is the use of a numbering system for her sources which corresponds to the article file names on her computer.
While there's a lot of freedom in how you set up your Excel files, there are still some best practices you'll likely want to follow. First, set up your table so that headers are marked as such. This way they won't be sorted along with the other cells if you sort a column from A to Z, for example. Also, you'll want to apply word-wrap formatting to cells to keep content from spilling over into neighboring empty cells. This keeps everything looking a lot tidier and makes it easier to skim. Another handy option, recommended by McQuilliam, is to freeze the header row so that your column headers stay visible even when you're adding entries at the bottom of your list.
The columns you include are more or less up to you, but you'll certainly need a column for source information to avoid inadvertent plagiarism or having to hunt down sources later on. In addition, a year column is invaluable for sorting your literature chronologically in preparation for writing your lit review. To keep track of how authors build upon and discuss each other's work, a cross-references column can also be helpful. It's important to make it very clear which analysis and thoughts are your own and which are the author's.
If you're planning on using filter features later on to search by study type, keyword, or some other criterion, you'll need to use a controlled vocabulary, i.e., each concept should be referred to by a single term rather than a handful of different synonyms. You can define this at the start in a key on a separate sheet of your Excel workbook so that you can easily refer to it as needed. Each time you decide to add new terms, just add them to your key.
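If you ever script any part of your workflow, even a few lines can enforce that key before terms reach your spreadsheet. This hypothetical Python snippet (the terms are examples only) shows the idea:

```python
# Toy controlled-vocabulary normalizer; all terms here are made up.
CANONICAL = {
    "rct": "randomized controlled trial",
    "randomised controlled trial": "randomized controlled trial",
    "qual interview": "qualitative interview study",
}

def normalize(term: str) -> str:
    """Return the canonical form of a study-type term, if one is defined."""
    cleaned = term.strip().lower()
    return CANONICAL.get(cleaned, cleaned)

print(normalize("RCT"))  # -> randomized controlled trial
```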
To save time, a streamlined option for organizing and categorizing your source information, notes, and quotes is Citavi, and we’ll look further into the benefits of using Citavi at the end of this post.
It's hard to argue with the advantages of ease, simplicity, and flexibility that the Excel method gives you. But there are still some big downsides to consider.
First, you have to set everything up yourself – it’s not already set up for you in a way that should fit most workflows. If you try something and later decide to take a different approach, you may need to go back and add in additional information for many sources you already examined.
Although search, filtering, and sorting options in Excel are much better than they would be in a Word table, the program is still a spreadsheet at heart which means that it’s “flatter” than a database. In other words, it’s less relational which makes it difficult to create complex search strings to get a subset of items that fit multiple criteria or that use more complicated search techniques such as Boolean logic or wildcards.
Another drawback is that the Excel approach involves a lot of manual entry. While some amount of manual work will always be necessary, for example, when you type up your comments or key takeaways, you won’t be able to directly extract information from PDFs (such as direct quotes or images) without using an additional PDF reader. Moreover, there are no time-saving automation options for adding source information that you might be accustomed to from your reference manager.
Speaking of reference managers, in many of the Twitter discussions around the Excel note-taking approach, there will always be a few comments asking why the person didn’t consider using their referencing software for their notes. Many proponents of the Excel approach stress that they do indeed use a reference management program to keep track of their source information but that they prefer to keep their notes and analysis in a separate Excel file. One of the reasons is that even though many reference management programs let you group references into folders and tag them with specific terms, they don’t let you easily keep track of and categorize notes on a particular source. You basically get a single notes field and that’s it. No way to categorize, group, or tag the note itself, just the source as a whole.
While this is true for many reference manager programs, there’s one that goes above and beyond its competitors – Citavi! While we’ve explored how it’s possible to create a literature review with Excel and Word, it is not the most efficient way available. With Citavi, you can easily keep track of, categorize, and connect your sources – all in one place.
Citavi is a reference management program that has been designed with extensive knowledge organization for any number of sources in mind and may, in many cases, be a better alternative to the Excel method.
Citavi lets you automatically add source information for most journal articles. Then, you can read PDFs and save notes and memos directly in the program. Annotating in Citavi is as simple as annotating on paper: you can highlight sections of text in colors that indicate whether a passage is important, worth citing, or something you'd like to analyze more closely. The only difference from annotating on paper is that these notes – which can be summaries, indirect quotations, direct quotations, and comments – are always linked directly to their location in the PDF. So, if you ever have to look up the context for one of your own comments or a direct quotation again, one click takes you directly to where you need to go and makes it easy to create your annotated bibliography.
Page numbers are saved automatically, as long as the PDF metadata includes that information. Otherwise, you just need to enter a page number for an article with the first “knowledge item” you save for it. Citavi will then add all the rest automatically.
Citavi keeps track of your metadata so it's easy to follow one of the hundreds of citation styles available in the program.
Although the knowledge item types are pre-defined, the many options will fit most needs, and you can also always use either the keywords, categories, or the core statement field to designate the type of note you are adding if you want more customization. Any terms you use can later be searched or used as filters (more on that below). In addition, for the reference as a whole you also have pre-defined fields for keywords, groups, evaluations, abstracts, notes, and cross-references. This lets you classify at both the reference and note level, so, if you want, you can assign different categories or keywords for a source as a whole and for a statement you find in it. If you need additional source fields, there are nine custom fields which you can rename and format with drop-down options.
Where Citavi really shines against Excel is in its search features and integration with Word and NVivo 14. You can create and save complex searches that combine or exclude certain terms, keywords, categories, note types, years, etc. You can make use of advanced search syntax, such as Boolean operators, wildcards, and regular expressions. You can rate sources and filter by rating. And, you have full-text search across all of your PDFs.
You can also view project statistics at a glance or use an add-on to do an analysis by author or another criteria. With Citavi and NVivo 14 integration, you can go beyond reference management by creating a springboard to collect references and thoughts, analyze literature, and connect empirical data with NVivo’s analysis tools – helping you dig deeper into your research and speed up your publishing time.
But the best part is that all of this information can be taken directly over to Word. You have all the analysis and quotes you’ve saved in a panel at the left and can just click to insert what you need. Citavi will insert the correct citation formatting and add an entry to your bibliography at the end. If you added your notes to an outline in Citavi, you can use the “Chapter” view to focus on what you need for a particular section. And, if you ever need to double-check the context for a direct quotation or your own paraphrase, you can click a link symbol to jump back to the exact spot in the PDF that you referred to.
If at some point you do need to export your reference information in table format for an appendix in your dissertation (for example, as documentation of the exclusion process for a systematic review), doing so requires just a few clicks. If you've previously worked with Excel and want to try out Citavi, importing is just as easy, and you can of course import all of your existing notes as knowledge items.
Last but certainly not least, if you use Citavi, you have the benefit of working with one tool instead of needing to juggle an Excel spreadsheet, a reference management program, and a PDF annotation tool or PDF reader.
We think it’s a no-brainer to use Citavi instead of Excel or Google Sheets to keep track of your reading for a literature review – but then again, we might be ever so slightly biased. What do you think?
Learn more about Citavi or request a free 30-day trial today!
Lumivero Support Center Now Live on the Lumivero Community Site
We’re excited to announce a powerful new customer support center for NVivo, Citavi, and Sonia located on the Lumivero Community site. Our modern support structure now features a user-friendly design, improved communication with the support team, and access to the knowledge base – all in one place!
With the new support center, you won’t have to depend solely on email for updates on your case tickets. Now you’ll be able to see the status of your cases, access the conversation history of all your tickets on one page, and easily reference past tickets as your full case history is saved to your profile.
Learn more about how you can create and follow your support ticket, receive alerts for your cases, enjoy improved communication with the support team, and access the knowledge base for NVivo and Sonia (Citavi coming soon) below:
Keep all your support cases in one place in the support center! Here you’ll be able to review case statuses, access all open and closed cases, and “follow” your cases which lets you receive alerts when there’s been a progression in your case.
To create a new case, simply choose the relevant product from the drop-down list of case forms. Your account information will already be populated – making the new case submission process shorter.
Easily communicate with the support team by accessing all your messages directly in the support center – allowing for a two-way conversation that doesn’t get lost in your inbox. Further improve your communication with the ability to upload videos and screenshots to easily show the issue you’re experiencing all on the same case.
Dig into our extensive knowledge base to find answers to your questions and discover new ways to uncover insights from your data. Plus, easily access articles on product updates and service packs (coming soon).
Access our library of how-to videos, watch expert presentations, and discover more ways to learn how to use NVivo, Citavi, and Sonia.
Plus, as part of the Lumivero Community site, you’ll have easy access to product resources, user discussion boards, and the upcoming events calendar that can help you dive deeper into your data.
Start improving your experience with NVivo, Citavi, and Sonia by checking out the new Lumivero Support Center in the main menu of the Lumivero Community and by accessing all the amazing insights provided by real user members of the community today.
Join for free today to get access to the Support Center, connect and engage with fellow users and experts, and learn and share insights that expand your network and enhance your user experience. From the login screen, simply click ‘not a member’ and sign up today.
Join the Lumivero Community
Develop your data analysis skills and connect with experts across industries at the free Lumivero Virtual Conference! The conference spans two full days from Sept. 27-28, features eight industry-leading keynote speakers, and is designed for anyone who wants to dig up deeper insights from their data.
Learn how to take your qualitative and mixed methods research to the next level with more than 70 hours of presentations, research and product sessions, and real-time Q&As – all discussing proven methods for reporting, presenting, and organizing data and innovative ways to manage student field experience programs.
“The Lumivero Virtual Conference is an amazing opportunity for researchers to meet, learn, and network with experts and other Lumivero software users around the world,” said Stacy Penna, EdD, Lumivero Growth Marketing Director.
Register for the Lumivero Conference
Attendees are encouraged to customize their learning experience by selecting one of seven conference tracks that will highlight solutions in the following topic areas:
Plus, all sessions will be recorded and made available to registrants for up to 30 days following the conference so you can control your learning experience.
Greg C. Ashley, PhD, Program Director of Ph.D. in Human Capital Management, Professor
Building Student Self-Efficacy and Appreciation for Data-Intensive Coursework
Kakali Bhattacharya, PhD, Professor in Research, Evaluation, and Measurements Program, University of Florida
Data Analysis is Relationship Analysis: De/colonizing Points of Consideration in Qualitative Research
Divya Bheda, PhD, Education Consultant, Divya Bheda Consulting Services: Capacity-Building for Change and Equity
An Equity Framework Offering: Strengthen Your Practice as a Leader, Educator, and Decision-Maker
Dana Linnell, PhD, Assistant Professor, Psychology Department, Program Director, MS Applied Psychology, University of Wisconsin-Stout
Improving Your Survey Questions to Improve Your Results
Johnny Saldaña, Professor Emeritus, Arizona State University
Designing Conceptual Frameworks for Qualitative Research Studies
JD Solomon, PE, CRE, CMRP, Founder of JD Solomon, Inc.
How to Effectively Communicate @RISK and DecisionTools to Senior Management
Karsten Zegwaard, Director of Work-Integrated Learning Research, University of Waikato, New Zealand
Current Trends and Challenges for Quality Work-Integrated Learning in the Higher Education Curriculum
Sharon Zumbrunn, PhD, Professor, Virginia Commonwealth University
Embracing the Writing Feedback Process
Learn from our product trainers how to get the most out of your Lumivero software: Citavi, @RISK, DecisionTools Suite, NVivo, Sonia, Tevera, and XLSTAT.
How to Register
It’s easy! Simply complete the free online registration form with either your Zoom account or email before Sept. 27 to access all the live sessions, plus content for up to 30 days after the virtual conference ends.
If you’re struggling with deep sensory data analysis or spending excess time adapting your sensory data format, these new XLSTAT features are for you!
The latest edition of XLSTAT has been optimized to save you time and take your sensory analysis to the next level. Quickly use multi-block functions such as STATIS and MFA with our new sensory data preparation feature and uncover new insights with the multivariate JAR (Just-About-Right) analysis and RATA (Rate-All-That-Apply) data analysis. Discover how XLSTAT’s new features and improvements will enhance your sensory data analysis with the highlights below.
JAR (Just-About-Right) Multivariate Analysis and Clustering
New Feature Included in Sensory and Premium
Get all the relevant information from your JAR (Just-About-Right) experiments. While JAR data is often processed only with penalty analysis, it can also be used to describe your products, find similarities and differences between them, compute homogeneity indices, and build a clustering of assessors.
With the new JAR analysis, you can perform a multivariate analysis that creates a map of products and their descriptions, and assess the homogeneity of responses to gauge the quality of your data. All of this is handled by pre-processing followed by an enhancement of the CATATIS method. If several groups of subjects with different points of view exist, an option allows these groups to be constructed automatically, thanks to an enhancement of the CLUSCATA method.
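To make the idea concrete, here is a minimal sketch of the kind of multivariate treatment JAR data supports. This is not XLSTAT’s CATATIS implementation – just an illustration on made-up data that collapses JAR scores into “too little / just right / too much” proportions per product and maps the products with a plain PCA:

```python
# Minimal sketch (not XLSTAT's CATATIS method): summarize hypothetical JAR
# ratings per product, then map the products with a plain PCA.
import numpy as np
import pandas as pd

# Hypothetical JAR data: one row per assessor x product, 5-point JAR scale
# (1-2 = too little, 3 = just about right, 4-5 = too much).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "product": rng.choice(["A", "B", "C", "D"], size=400),
    "sweetness": rng.integers(1, 6, size=400),
    "saltiness": rng.integers(1, 6, size=400),
})

def jar_profile(scores: pd.Series) -> pd.Series:
    """Proportion of too-little / JAR / too-much answers for one attribute."""
    return pd.Series({
        "too_little": (scores <= 2).mean(),
        "jar": (scores == 3).mean(),
        "too_much": (scores >= 4).mean(),
    })

# Build a product x (attribute, category) table of proportions.
profiles = df.groupby("product")[["sweetness", "saltiness"]].apply(
    lambda g: pd.concat({col: jar_profile(g[col]) for col in g.columns})
)

# Map the products in 2D with PCA on the centered profile table.
X = profiles.to_numpy(dtype=float)
X = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(X, full_matrices=False)
coords = X @ vt[:2].T
print(pd.DataFrame(coords, index=profiles.index, columns=["dim1", "dim2"]))
```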
RATA (Rate-All-That-Apply) Data Analysis
New Feature Included in Sensory and Premium
Analyze your RATA (Rate-All-That-Apply) tasks efficiently with the new RATA data analysis function in the sensory analysis menu. With just a few clicks, you can test the discriminability of your attributes, check the homogeneity of your subjects, or build a product/attribute map to see how your products are characterized – all within a single feature! Plus, you can check for repeatability and test the homogeneity of subjects both globally and by attribute.
These functionalities were presented at Pangborn 2023 and come from our XLSTAT team’s research.
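As a rough illustration of what testing attribute discriminability involves, the sketch below runs a one-way ANOVA per attribute across products on hypothetical RATA data. It mirrors the general idea, not XLSTAT’s exact procedure:

```python
# Sketch of one common way to test whether RATA attributes discriminate
# between products: a one-way ANOVA per attribute. Data are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# Hypothetical RATA data: 0 = attribute not selected, 1-3 = selected intensity.
df = pd.DataFrame({
    "product": rng.choice(["A", "B", "C"], size=300),
    "crunchy": rng.integers(0, 4, size=300),
    "sweet": rng.integers(0, 4, size=300),
})

for attribute in ["crunchy", "sweet"]:
    groups = [g[attribute].to_numpy() for _, g in df.groupby("product")]
    f_stat, p_value = f_oneway(*groups)
    verdict = "discriminates" if p_value < 0.05 else "does not discriminate"
    print(f"{attribute}: F={f_stat:.2f}, p={p_value:.3f} -> {verdict}")
```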
Create a Products/Assessors Table
New Feature Included in Sensory and Premium
Save time by automating the preparation of your sensory analysis data! Because data is often structured vertically, with each line describing a subject/product combination, it can be time-consuming to switch to the block structure required by certain multi-block methods such as MFA, GPA, STATIS, and CLUSTATIS.
Now you can easily switch from vertical mode to horizontal mode in just a few clicks to use these sensory features. This feature can even be used before your projective mapping analysis!
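For readers curious about the underlying transformation, here is a small pandas sketch of the same vertical-to-horizontal reshaping. XLSTAT handles this through its interface; the column names here are hypothetical:

```python
# Sketch of long-to-wide reshaping for multi-block sensory methods
# (illustrative only; not XLSTAT's internal implementation).
import pandas as pd

# Vertical layout: one line per assessor x product combination.
long = pd.DataFrame({
    "assessor": ["a1", "a1", "a2", "a2"],
    "product": ["P1", "P2", "P1", "P2"],
    "sweetness": [6.0, 4.5, 5.5, 4.0],
    "bitterness": [2.0, 3.5, 2.5, 3.0],
})

# Horizontal/block layout: products as rows, one block of attribute
# columns per assessor, as required by MFA, GPA, STATIS, or CLUSTATIS.
wide = long.pivot(index="product", columns="assessor",
                  values=["sweetness", "bitterness"])
wide = wide.swaplevel(axis=1).sort_index(axis=1)  # group columns by assessor
print(wide)
```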
DOE (Design of Experiments) for Sensory Data Analysis
Improved Feature Included in Sensory and Premium
Following expert recommendations, the DOE function for sensory analysis can now generate random 3-digit codes to anonymize your products.
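The practice this automates – assigning each product a unique random 3-digit blinding code – is easy to illustrate. The sketch below uses hypothetical product names and is not XLSTAT’s internal code:

```python
# Sketch: generate unique random 3-digit blinding codes for sensory samples.
import random

products = ["cola_A", "cola_B", "cola_C"]  # hypothetical products
codes = random.sample(range(100, 1000), k=len(products))  # unique 3-digit codes
blinding = dict(zip(products, (str(c) for c in codes)))
print(blinding)  # e.g. {'cola_A': '417', 'cola_B': '820', 'cola_C': '135'}
```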
How to Install the Update
This new version gives you access to all the features mentioned above, and installation is recommended for all users.
If you have a valid XLSTAT license with access to maintenance and upgrades, you can download the new version for free.
If you are currently using our trial version, you can purchase an XLSTAT license to access these new features.
Never tried XLSTAT before? Download your free trial today!
Increased costs, unreliable transportation, and scarce supplies all tell the story that has dominated news feeds since 2020: the global manufacturing sector is still adjusting to the disruptions of the COVID-19 pandemic and the war in Ukraine, and supply chain risk remains high. A September 2022 analysis of the supply chain crisis by the recruiting firm Randstad reported that, almost three years after the beginning of COVID’s first wave, materials were still slow to ship, energy costs remained high, and labor shortages were a persistent issue. In fact, a 2021 study by Deloitte and the National Association of Manufacturers projects that firms in the United States could have as many as 2.1 million unfilled vacancies by 2030.
Given this uncertainty, it’s no surprise that 93% of supply chain executives surveyed by McKinsey in 2021 said they were planning to develop more agile and resilient supply chains. A few steps that companies have already taken include regionalizing supply lines, diversifying suppliers, and relocating production closer to customer bases. Building that resilience also means paying more attention to operations risk management. In the same McKinsey survey, 59% of supply chain executives said they had adopted new risk analysis and management practices over the previous 12 months.
"With climate change and geopolitical tensions expected to impact food and cause disruptions, now is the time for supply chain leaders to take initiative and be creative as to how they can invest and improve their operations,” said Dr. Madhav Durbha, Vice President of Supply Chain Innovation at Coupa in a PR Newswire article.
Predictive analytics tools can play an important role in helping manufacturers understand risks to supply chains, discover creative solutions, and take steps to optimize their operations. Lumivero’s @RISK and DecisionTools Suite use Monte Carlo simulation, a computational technique that estimates the probability of possible outcomes based on the uncertainty in the input values. In this article, we’ll look at how two organizations have improved their supply chain risk management practices and overall operations using the Monte Carlo simulation models generated by @RISK and DecisionTools Suite.
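For readers unfamiliar with the technique, here is a minimal, self-contained illustration of Monte Carlo simulation. @RISK runs this kind of analysis inside Excel; the distributions and figures below are invented for the example:

```python
# Minimal Monte Carlo sketch: sample uncertain inputs many times and study
# the resulting distribution of outcomes. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of simulated scenarios

# Uncertain inputs, each drawn from an assumed distribution.
unit_cost = rng.triangular(8.0, 10.0, 15.0, size=n)  # supplier cost per unit
demand = rng.normal(50_000, 8_000, size=n)           # units sold
price = 14.0                                         # fixed sale price

profit = (price - unit_cost) * demand

# The simulation yields a full distribution of outcomes, not a single number.
print(f"Mean profit:         ${profit.mean():,.0f}")
print(f"5th-95th percentile: ${np.percentile(profit, 5):,.0f} "
      f"to ${np.percentile(profit, 95):,.0f}")
print(f"Probability of loss: {(profit < 0).mean():.1%}")
```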
Hitachi Solutions East Japan – Driving Better Supply Chain Risk Analysis with @RISK Developer Kit
Hitachi Solutions East Japan, Ltd. is a software and systems engineering firm that also oversees the installation of computer hardware systems for manufacturing companies. One significant challenge Hitachi faces is providing customers with a tool for guiding informed decision-making about manufacturing operations based on fluctuations in material costs, foreign currency exchange rates, and consumer demand. With the @RISK Developer Kit, they found the solution they needed to help avoid pitfalls and make decisions with confidence.
@RISK’s Monte Carlo simulation modeling allowed Hitachi’s planning team to generate simulations showing the full range of possible outcomes and the likelihood of adverse events that could impact operations. However, @RISK’s tools run in Microsoft Excel, and Hitachi wanted a more visual graphical user interface (GUI) that could incorporate @RISK analyses.
With the @RISK Developer Kit and C++ programming tools, Hitachi’s team developed a new user dashboard that pulls in data from a wide range of their customers’ systems, including production, sales, and inventory data. This interface allows customers to understand how producing more or less of a product can impact profitability and helps them decide how to prepare facilities or adjust operations to mitigate risk.
Novelis – Visualizing Manufacturing Risk Analysis to Optimize Operations
Novelis is the largest aluminum rolling and recycling company in the world. Its aluminum is found in everything from beverage cans and buildings to auto components and aircraft. With a global footprint that covers 33 facilities in nine countries, Novelis is constantly developing new products to serve its customers—and new production processes to improve its existing products.
Until recently, this multinational corporation lacked a quantitative risk analysis process. Its Research and Technology team instead relied heavily on information from the commercial side of the business when launching new products or production methods. With @RISK, the Novelis team was able to draw on data from the scientific branch as well.
Armed with testing data from the R&D team, Novelis’s Senior Manager of Innovation Strategy, Dave MacAdam, was able to analyze the risk involved with changing to a new technique intended to improve the efficiency of aluminum recycling. With Monte Carlo simulation in @RISK, MacAdam calculated the probability of certain failure modes based on changing variables in the process. Sensitivity analysis was then used to generate tornado charts ranking the effect of the input variables on the outcome. These graphs allowed MacAdam to show executives that most of the risk involved with the new recycling process appeared to coalesce around a few specific technical factors. The team was then able to commission more research into those factors and devise mitigation strategies.
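The sensitivity-analysis idea behind a tornado chart can be sketched in a few lines: simulate the uncertain inputs, then rank each input by the strength of its rank correlation with the output. The model and variable names below are hypothetical, not Novelis’s actual data:

```python
# Sketch of tornado-chart-style sensitivity analysis: rank each uncertain
# input by its Spearman correlation with the output. All values hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 50_000
inputs = {
    "melt_temperature": rng.normal(700, 15, n),
    "scrap_moisture": rng.uniform(0.0, 0.05, n),
    "alloy_purity": rng.normal(0.97, 0.01, n),
}
# Invented outcome model: moisture and impurity drive failure risk most.
failure_risk = (10 * inputs["scrap_moisture"]
                - 5 * (inputs["alloy_purity"] - 0.97)
                + 0.001 * (inputs["melt_temperature"] - 700)
                + rng.normal(0, 0.05, n))

ranked = sorted(
    ((name, spearmanr(values, failure_risk)[0])
     for name, values in inputs.items()),
    key=lambda item: abs(item[1]), reverse=True,
)
for name, rho in ranked:  # largest bars at the top, as in a tornado chart
    print(f"{name:>16}: {rho:+.2f}")
```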
MacAdam’s team also used PrecisionTree to develop models that showed decision-makers how risk compounds at each stage of the product launch process. By analyzing how one link in the production chain influences the next, decision-makers could take practical steps to adjust operations or request further testing from their researchers to reduce or retire risk from prototyping all the way to production.
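The compounding effect PrecisionTree captures with decision trees can be illustrated with a tiny expected-value calculation across launch stages. The stage names, probabilities, and payoffs below are hypothetical:

```python
# Tiny illustration of how risk compounds across launch stages (the idea a
# decision tree formalizes); all probabilities and payoffs are made up.
stages = [            # (stage name, probability the stage succeeds)
    ("prototype", 0.90),
    ("pilot_line", 0.80),
    ("full_production", 0.95),
]

p_launch = 1.0
for name, p in stages:
    p_launch *= p
    print(f"after {name:>15}: cumulative success probability = {p_launch:.3f}")

payoff_if_success = 5_000_000   # hypothetical net payoff of a launch
cost_of_program = 1_200_000     # hypothetical development cost
expected_value = p_launch * payoff_if_success - cost_of_program
print(f"expected value of proceeding: ${expected_value:,.0f}")
```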
Optimize Your Manufacturing Operations with Monte Carlo Simulation
Improve your understanding of supply chain risk to inform better operational decision-making with tools like @RISK, @RISK Developer Kit, and PrecisionTree. Explore our free models based on real-world manufacturing scenarios today: