Executive Summary

Central Michigan University’s Counselor Education program needed a new software partner that was more user friendly and eased the administrative burden on its program administrators. They found Tevera and haven’t looked back.

The Challenge

Sheri Pickover’s counseling journey started at the University of Michigan. After earning her bachelor’s degree in psychology followed by her doctorate, she started working in shelters and running counseling clinics. Her career led her back into academia where she landed at Central Michigan University in her role as a faculty member and program administrator.

When Sheri started, the program was using other assessment software. Because of the amount of manual labor and tedious tasks that went into using the system, Sheri began to search for a solution that would allow their program to grow more efficiently and keep up with their burgeoning student population.

As the person doing all of the program assessments, Sheri was frustrated that the software wasn’t as intuitive or forgiving as it should have been. For example, when she created something and then needed to change it, the program wouldn’t allow those changes. It forced her to start over and rebuild everything from scratch for end users.

So, Sheri started looking for a new software partner.

The Solution

Sheri had heard of Tevera while working at another university and decided to start exploring whether they might be a good fit for Central Michigan’s Counselor Education program.

She wanted to make sure that Tevera would:

But most importantly, Tevera needed to have a best-in-class customer support system.

After doing her research and taking Tevera for a test drive, Sheri was able to answer “yes” to all of the above questions and the program adopted Tevera in January of 2021.

"In Tevera, nothing is more than a few clicks away."

Sheri Pickover, Ph.D., LPC
Central Michigan University

The Impact

Right away, Sheri noticed how much easier Tevera was to use, how it helped streamline their internal processes, and how much more support their students were able to get.

First, Tevera is user-friendly.

Tevera is easy to access and use not only for administrators like Sheri but also for supervisors, faculty members, and students.

Sheri is also a fan of the interface and data reporting systems. With Tevera, she only has to run two reports: “I have to run one just by program and then I can run by program and cohort and everything pulled out to organize.”

Along with this, she has more control over the data with Tevera: she can easily log into the program and review the data whenever she needs to.

When problems arise, which they rarely do, Sheri notes that Tevera’s help desk is a great resource.

Next, Tevera has improved administrative processes.
Sheri estimates that a task that might have taken three hours in their old system now takes only thirty to forty-five minutes.

Sheri also finds it much easier to communicate with students. Before Tevera, students had to email in their applications and files. Not only did this flood her inbox, but things got lost or students were missing information. Sheri no longer finds herself reaching out to students about gathering missing elements.

Finally, Tevera offers ease of use.

And how does Tevera serve their students?

The time tracking component is huge: students love how simple it is to use, and because they can log hours from their phones, it’s easy to stay on top of everything.

As Sheri says about Tevera, nothing is more than “a few clicks away.”

REQUEST A DEMO

Here a Spreadsheet, There a Spreadsheet, Everywhere a Spreadsheet!

Your students spend hours of time and effort keeping track of all their hard-earned experience, and you, your faculty, and external site supervisors invest just as much time trying to maintain cohesive, collaborative support of student field experiences. But in the end, everyone is left with their own siloed jumble of in-progress and finalized files and folders, and figuring out whose version to go by can be a messy tangle of digital and loose-leaf records!

To facilitate a collaborative time tracking process, your students need to:

With spreadsheets and other individualized tracking tools, even that basic process can devolve into three disjointed steps that require a whole lot of manual effort.

What you need is a single tool that is so straightforward and efficient that you, your students, and their supervisors can focus on what’s most important: the placement experience itself.

Here’s where Tevera steps in with Student Activity and Time Tracking. In our all-in-one Timesheets workspace, students can:

Innovating the Time-tested Method

Tevera, designed to be easily navigable, intuitive, and functionally thorough, not only meets those core fundamentals but exceeds them, elevating your students’ activity and time tracking:

Core Experience Tracking

Comprehensive Activity Tracks

Tevera works with universities to construct tracks that meet their programmatic needs. Our skilled experts can craft consistent and uniform tracks that are customizable to your program!

Time Track Progress Monitoring

Programs often require their students to meet specific guidelines and criteria as they work toward their degrees such as total hours and minimum hours requirements. Students, faculty, and site supervisors can monitor progress toward these criteria in real time using the Track Progress Display, keeping everyone in the loop throughout the student’s field experience!

Student Experience Reporting

Forget passing paperwork back and forth! In Tevera, time entries and timesheets can be reviewed and approved by supervisors electronically.

Time reports can be generated to show the hours tracked by students over a specified timeframe. These reports can then be sent to reviewers (site supervisors and/or program faculty) for signature. Designated reviewers have the option to either approve timesheets or return them to students.

If a time report is returned to the student, they will receive a notification to make the necessary corrections and resubmit it for approval.

Once all reviewers have signed the final report, students are left with a PDF document that will be stored in Tevera for life, or downloaded and printed as needed, giving them permanent access to their earned experience hours for licensure or job applications.
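As a rough illustration of that cycle, here is a minimal state-machine sketch in Python. All class and method names below are hypothetical, invented for this example; they are not Tevera’s actual API.

```python
from enum import Enum

class Status(Enum):
    DRAFT = "draft"
    SUBMITTED = "submitted"
    RETURNED = "returned"       # sent back to the student for corrections
    APPROVED = "approved"       # all reviewers have signed

class TimeReport:
    """Toy model of the submit -> review -> approve/return cycle."""

    def __init__(self, student, reviewers):
        self.student = student
        self.reviewers = set(reviewers)   # e.g., site supervisor + faculty
        self.signatures = set()
        self.status = Status.DRAFT

    def submit(self):
        self.status = Status.SUBMITTED
        self.signatures.clear()

    def review(self, reviewer, approve):
        assert self.status is Status.SUBMITTED
        if not approve:
            self.status = Status.RETURNED   # student is notified to resubmit
            return
        self.signatures.add(reviewer)
        if self.signatures == self.reviewers:
            self.status = Status.APPROVED   # final signed report is archived

report = TimeReport("student_a", ["site_supervisor", "faculty_liaison"])
report.submit()
report.review("site_supervisor", approve=True)
report.review("faculty_liaison", approve=True)
print(report.status)   # Status.APPROVED
```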

You can choose from a number of time report formats to facilitate this process. See a full list of time reports that can be run in Tevera here!

For California universities, don’t worry! We didn’t forget about you—Tevera coordinates with universities to monitor Board of Behavioral Sciences (BBS) changes to licensure requirements, so you’ll always have access to up-to-date BBS reporting!

Elevated Tracking

Tevera’s innovations to the core needs of experience tracking can already elevate some university programs, but Tevera strives to elevate all programs by including various efficiency features and entry sorting capabilities:

Streamlined Efficiency Features

Manually entering every item into a log can be time consuming, but Tevera offers streamlined features to speed up your students’ time tracking. Some of our efficiency features include:

Organize and Sort Time Entries

As students complete their placement experiences, they may need to review their past time entries for various reasons. With Tevera’s Agenda View, students can not only display all of their past time entries but also sort those entries using various filter options. Tevera can filter time entries using one or more filters, including but not limited to:

One Space, Many Functions

With Tevera, you can not only ensure that your students are fulfilling their activity and hours requirements, but also ensure that they are making the most of their placement experiences by minimizing the time, energy, and effort needed to document them.

Help your students make the most of their education by scheduling an overview with Tevera!

REQUEST A DEMO

The phone calls, emails, paperwork, and negotiations are done! Your placement team has successfully placed all your students at external agencies for their practicum experiences in teacher education, social work, counseling education, and beyond. Now, they are gaining real-world experience under the supervision of qualified field instructors, but the work doesn’t stop here.

How can your program faculty remain connected to students and support their skill development while they are out in their placements? How can you ensure site supervisors provide high-quality feedback to students that helps improve their learning process? And how can you be sure each student in your program is meeting competency benchmarks, no matter how remote their placement?

With Lumivero, you can achieve all of this and more, guiding students toward their learning goals!

Enter Video Assessment

Your program faculty can’t be everywhere at once, but with smartphones and increasingly accessible recording devices, it’s never been easier for students to record their clinical experiences! With the right tools in place to assess student learning, this raw footage is rocket fuel for your students’ professional development – providing key insights into the formative and summative assessment process.

Tevera’s video assessment tool allows students to upload video recordings to an interactive platform that brings students, program faculty, and site supervisors together.

Once uploaded, students, program faculty, and site supervisors alike can review video footage and share actionable, time-stamped feedback, directly linked to the video.

Finally, program faculty and site supervisors can evaluate student competency development all in one place, so you can keep an eye on student progress across your entire program in real time!

Stakeholder Benefits

For Students:

Hone skills and drive professional success with contextualized, actionable feedback from supervisors.

For Instructors and Placement Supervisors:

Provide insightful and actionable feedback on students’ authentic client interactions at any time and place.

For Program Administrators:

Have peace of mind that students are getting the most out of their placement experience with all the resources your program and their placement site have to offer – supporting both their formative and summative learning outcomes!

Unleashing the Power of Video Assessment: Enhancing Student Learning and Supervisory Efficiency

Tevera's video assessment tools not only enhance student learning through improved feedback but also streamline the supervisory processes – ensuring that educational programs can deliver high-quality, consistent, and scalable support to their students.

By integrating video assessment into practicum management tools, placement teams can ensure comprehensive, effective, and accessible assessments for all students – preparing them for successful careers in teacher education, social work, counseling education, and more!

A Single Solution for Powerful Program Management

Combine Tevera’s video assessment feature with our complete suite of field experience and program outcomes solutions for unbeatable program management that helps students achieve their learning objectives! Schedule a product overview with our team to learn more.

SCHEDULE AN OVERVIEW

Scale Your Field-Based Education Programs and Improve Student Experience

Academia has always had a complex relationship with emerging technology, as is evidenced by its stance on AI tools being used in the classroom and in research.

But students, teachers, and administrators are demanding more efficiency and collaboration across physical boundaries, and the benefits of technology are expansive for day-to-day placement tasks and reporting.

To help you scale and improve your program, we’ve outlined the seven functions that are essential to have in your student placement management tool.

  1. Efficient Placement Programming: Clear and efficient mapping of all stages of your student placement program, from identifying the opportunity to completing evaluations, ensures a seamless transition of student data. Administrators must be able to track program requirements virtually and streamline the placement process through optimized placement workflows.
  2. Simplified Time Tracking: Allowing students to track their hours, with support from faculty and administrators, helps them complete their field experience requirements on time.
  3. Integrated Communication: By having one software tool carry out all the steps in the placement process, communication becomes more accurate and efficient. Students should be able to interact with site managers, professors, and administrators, and vice versa, with communication logged for easy reference.
  4. Supervisor Qualification Tracking: Ensuring all supervisors and placement opportunities meet accreditation requirements through qualification tracking within your field education management tool improves the overall field experience as students are mentored by appropriately qualified individuals.
  5. Automated Workflows: Creating intelligent workflows and automated forms and reports in your software ensures centralized data collection, accurate progress updates, and up-to-date status reports on task completion – ultimately giving students a better experience and saving administrators time. Automated messages are also crucial to helping guide students back on track to meet deadlines and placement requirements.
  6. Collaborative Field Evaluations: Feedback that is timely and efficient is critical to building an effective student placement program. Your tool should empower faculty, students, and placement supervisors to share reflections on the experience in one platform which supports the overall placement process and students’ development.
  7. Robust Reporting and Assessments: The program placement process, student performance, and other key metrics must be easily measured in your management software – allowing you to identify areas of strength and weakness. Administrators should be able to export and share data with professors, placement supervisors, and students.

Build a Better Student Placement Program with Lumivero

We have the right student field placement tool to fit your needs! Contact us to start the conversation!

Start the Conversation


This is a case study based on a current Tevera customer. Due to this university’s policies regarding promoting vendors, we have not identified the customer.

"Our students expect us to have technology. Higher education is an investment and if you’re not operating at a high level and you’re not creating a competitive space. We want our students to have access to this information.As a university we want to be progressive, we want to be creative and innovative and give students access to what they need. And we would love for them to do it without us needing to hold their hands and Tevera allows us to do just that.”"Dr. of Social Work, Field Program Director

Executive Summary

“Program X,” a well-respected social work program, had a problem. They were still using faxes and paper to manage their students’ field experiences, among other administrative tasks. These inefficient and, frankly, cumbersome processes were barriers to their students’ learning experience. The program’s field director knew something had to change.

“Program X” wanted a digital tool that would enhance student learning outcomes, make field placements and management easier, and most importantly, be able to grow with them into the future of social work. They hoped to find one that was the whole package and were unwilling to compromise on the quality of the partnership they were entering.

When they heard about Tevera, they were immediately attracted to how intuitive its interface was. But to be thorough, they vetted multiple platforms.

In the end, the team decided on Tevera because it made field easier, saved time for their administrators, provided a life-long asset to their students, and was easily configurable to the unique needs of their program.

The Challenge

“Program X’s” field education director was fed up with using paper, faxes, and spreadsheets to manage their social work program’s data. From field to accreditation, data was hard to get and harder to generate reports from.

They knew that they needed to upgrade their processes by finding a software solution that could:

Above all, they knew they needed a system that was customizable and could grow with their program and accommodate future changes.

If they didn’t adopt a digital solution, and soon, this university risked losing field site partnerships and seeing enrollment in their social work program decline. The field director said, “We had supervisors who in the past had stopped supervising students because the paperwork was so onerous.”

So, they began looking at online solutions and talked to another program that had already started using Tevera, whose staff told them, “The support has been terrific and just wonderful. It has really created a lot of time saving in certain parts of the process. We’ve been very happy.” When she heard that, this doctor of social work decided to explore Tevera further.

Why this Social Work Program Chose Tevera

“Program X” ultimately chose Tevera because of how easy it was to use, its customization capabilities, and its features.

Field Placement: More Options for Students, More Time Saved by Field Educators

First, they loved how it simplified field management by allowing students to explore all of their placement options to find the best fit for them.

“Tevera is changing the way that our field program has run from before.”

Enhanced Collaboration between Students, Faculty, and Field Educators

The good news for field doesn’t stop with students. They also chose Tevera because it gives them greater insight into their agencies, lets them run easy-to-read reports, and shows them where they’re missing different practice areas. This ability to analyze their agencies in a few seconds gives them greater control over their students’ field experiences and helps ensure the best possible outcomes.

This documentation management for field also enhances collaboration between students, faculty, and site supervisors because everything lives in one spot and the assessment data comes out in a single report. Before Tevera, their assessment coordinator would come into their office with boxes of reports and data. Now, it’s a few clicks of a button.

Tevera’s assessment management feature has saved professors, supervisors, and assessment coordinators dozens of hours while giving them more insight into how different sites perform and into their students’ learning outcomes.

“These aren’t questions that we have to put on a shelf any longer, these are things that we can go and run quickly.”

Configuring a Digital Future: Easy-to-Use and Meeting Student Expectations

When this field program director first came on board, original signatures were required on everything and faxing was the name of the game. They remember teaching a capstone class in their first year when a student raised her hand and asked, “Is there a portal we could put this into?”

At that time, there wasn’t. But it was a wake-up call. The students were there. They were expecting technology in their classrooms and tools that would make their academic careers easier. Looking into software became a necessity to remain competitive and continue attracting students. But technology changes fast, so they wanted software that was easily configurable and could grow with them. Fortunately, Tevera met that bar and more.

“Tevera is going to play a really important role in our future. Its technology is very valid, it serves as an asset for our program and our university, and we’re going to use it to all of its capabilities.”

“What I really appreciated about Tevera was that it was so configurable. The options were so robust and you could turn on and off just about everything. And so what I quickly realized is that it was built the right way.”

A True Partnership

It wasn’t just about the software for the team. They wanted a team behind the software that would be accessible, responsive, and prioritize the customer-vendor relationship. Because Tevera is specifically designed to help social work programs thrive, this piqued their interest.

And on Tevera’s side, this program has been an invaluable partner, giving feedback, making suggestions, and helping shape the future of the software. Both teams are working toward the same goal – shaping the future of social work – and being able to collaborate on the software is a unique opportunity for both.

“We’ve had great support. We really appreciate the insight and the responsiveness of the team.”

“Tevera is really changing the way that our students and our field instructors are able to engage with one another and with their faculty liaisons.”

Interested in learning more about how Tevera can enhance your institution’s field-based program? Request a demo or reach out today to discover its full potential.

A Dive into Documentation, Evidence Gathering and Continuous Quality Improvement (CQI)

Accreditation management is a multi-faceted process that requires meticulous planning, systematic implementation, and ongoing oversight. Among the myriad challenges faced by educational institutions during the accreditation process, two stand out as particularly daunting: Documentation & Evidence Gathering and Continuous Quality Improvement (CQI).

In this blog post, we’ll delve into these two critical areas, uncovering the challenges associated with them, offering viable solutions, and outlining how Tevera simplifies both.

Documentation & Evidence Gathering

Documentation and evidence gathering form the backbone of any accreditation process. This ensures that academic standards, policies, and processes within an institution are consistently upheld.

By compiling a comprehensive record of the institution’s practices, achievements, and areas of improvement, documentation serves as a testament to the institution’s commitment to maintaining the highest educational standards. Furthermore, it offers accrediting bodies a tangible insight into the institution’s operations, allowing them to assess its eligibility for accreditation.

Without proper documentation, it’s almost impossible to showcase an institution’s competencies and adherence to stipulated standards. Thus, comprehensive and accurate documentation stands as a non-negotiable pillar in the pursuit of accreditation.

Challenges

Challenges to accreditation management, especially documentation management and continuous quality improvement, are numerous, but these three might be the toughest:

Volume & Complexity

Accreditation requires a vast amount of documentation, which can be overwhelming given the complexity of topics covered.

Cross-Departmental Coordination

Gathering evidence often involves liaising with various departments, making the task even more challenging.

Timeliness & Accuracy

Accreditation bodies demand not just volume but timely and accurate documentation, which institutions often find challenging to produce.

Solutions

To address these challenges, a few solutions are often recommended by institutions that have successfully navigated their own accreditation processes:

Dedicate a CQI Team

A team solely focused on CQI can plan, execute, and measure quality improvement initiatives without being encumbered by day-to-day operations.

Institution-Wide Feedback Platforms

Utilizing technology to solicit feedback from students, faculty, and other stakeholders can streamline the feedback process, ensuring that all voices are heard and considered.

Outcome Metrics & Dashboards

Developing specific metrics to measure the outcomes of CQI initiatives can be complemented with visual dashboards. This not only helps in internal monitoring but also provides tangible proof of improvement to accrediting bodies.

How Tevera Simplifies Documentation & Evidence Gathering and Continuous Quality Improvement

Tevera’s all-in-one education management platform simplifies documentation management and continual program improvement through a variety of interrelated features including:

Flexible Assessment Structure

Tevera’s flexible assessment structure allows you to track student performance on key program-wide assessment standards, including those mandated by an accrediting body or your program’s own specific KPIs.

Align assessment standards to rubric criteria and distribute assessment points across courses so that all faculty can assess student performance on relevant, demonstrable criteria, using a consistent rating scale.

Importance of this Feature

Tevera allows you to design a consistent, repeatable strategy for measuring student performance and competency development across your program for reliable insights into program and student growth. With Tevera, you can easily align assessment standards to rubric criteria and distribute assessment points across courses so that all faculty can assess student performance on relevant, demonstrable criteria, using a consistent rating scale.
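To make that concrete, here is a minimal data sketch of what such an assessment map could look like. Everything here (standard names, course codes, rubric criteria) is invented for illustration, not drawn from Tevera.

```python
# Hypothetical assessment map: each program standard is tied to rubric
# criteria and to the courses where it is assessed, on one shared scale.
assessment_map = {
    "Standard 1: Ethical Practice": {
        "rubric_criteria": ["Applies an ethical decision-making model"],
        "assessed_in": ["COUN 500", "COUN 610"],
    },
    "Standard 2: Assessment Skills": {
        "rubric_criteria": ["Selects appropriate instruments",
                            "Interprets results accurately"],
        "assessed_in": ["COUN 520", "COUN 610"],
    },
}
RATING_SCALE = [1, 2, 3, 4, 5]   # one consistent scale, program-wide

# Program-wide view: which courses carry assessment points for each standard?
for standard, detail in assessment_map.items():
    print(f"{standard}: {', '.join(detail['assessed_in'])}")
```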

Assessment Mapping

With Tevera, you can take a program-wide view of your entire assessment strategy by viewing and exporting it, mapped out across all courses in your program.

Generate an assessment map report to review the assessment strategy that has been built into your program’s courses in Tevera.

Importance of this Feature

Reaffirm your assessment strategy and ensure all standards are evaluated appropriately throughout your program by viewing, exporting, and sharing your comprehensive assessment map.

Assessment Outcomes Reports

Generate reports displaying student outcomes by individuals, classes, cohorts, specializations, and select demographic characteristics to analyze student performance on one specific assessment point or across a set of standards evaluated throughout the program.

Importance of this Feature

Gain insights to inform accreditation reports and continuous program improvement efforts by generating assessment outcome reports in a wide variety of formats.

Data Exports

Exporting key program data to a spreadsheet for deeper analysis is as simple as the click of a button. Click the export icon alongside any table in Tevera to automatically generate an Excel spreadsheet of the displayed data.

Importance of this Feature

Take Tevera’s insights with you anywhere, by easily exporting any data, such as documentation, user information, student timesheets, partnering site information, and more, at the click of a button.

Outcomes-Based Assessment

Enable program faculty to evaluate student learning outcomes across the breadth of their coursework. Add key assignments and assessment rubrics to all core courses so that students can upload work samples and faculty can evaluate student outcomes aligned with student learning outcome targets.

Importance of this Feature

Foster a culture of reflection and improvement by incorporating outcomes-based assessment into all core coursework throughout your program.

Data Storage and Organization

Organize your program’s data and documentation effortlessly and leave arduous record-keeping in your program’s past. Tevera’s intuitive data storage structure will ensure that your data and documents are automatically organized according to logical schema, making it easy for you to find, update, and export data any time you need it.

Importance of this Feature

Organize, update, store, and export data and documentation in an intuitive system, for thorough record-keeping and program management over time.

Learn How Tevera Can Transform Your Program

Tackling the challenges associated with accreditation management requires a mix of strategic planning, technology, and collaborative effort. Programs that invest in these areas are not only better prepared for the accreditation process but are also better equipped to offer superior educational experiences to their students.

Learn how Tevera can support you by scheduling a product overview for your team to review Tevera’s features and discuss your program’s requirements.

Schedule a Product Overview

Field-based student learning programs are an excellent way for students to experience and learn through hands-on activities. These programs allow students to immerse themselves in real-life situations and apply theoretical knowledge to practical situations.

However, program administrators and instructors must determine and evaluate metrics to measure these programs’ success. Here are three measurable metrics they can use.

Student Engagement

Student engagement refers to the degree to which students show interest and actively participate in the learning process. It’s a telling indicator of how much students gain from their field experiences.

Students’ engagement level often directly correlates with the quality of the field-based student learning programs. When students are genuinely engaged, they’re more likely to grasp complex concepts and develop critical thinking skills.

Field-based programs provide a unique platform for students to participate actively – encouraging them to ask questions, explore, and make discoveries independently. Engagement could manifest as enthusiastic participation in tasks, willingness to take on challenges, or persistent exploration of the subject matter.

For example, if students work on an environmental conservation project, they might actively engage by suggesting innovative ways to reduce waste or finding creative solutions for sustainability. Here, students are not merely passive receivers of information but are active participants in constructing their knowledge.

To evaluate student engagement in field-based programs, instructors can use a combination of methods such as pre- and post-program surveys, observations, and interviews. These can help measure students’ level of involvement before and after attending the program.

Evaluation can also involve tracking participation rates and positive attitudes toward learning the subject. Surveys and questionnaires are also great ways to gather quantitative data on students’ engagement levels for statistical analysis.
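As one concrete illustration of that quantitative analysis, matched pre- and post-program survey scores can be compared with a paired t-test. The scores below are invented; this is a minimal sketch, not a prescribed protocol.

```python
# Paired comparison of engagement survey scores (1-5 scale) for the
# same eight students before and after a field-based program.
from scipy import stats

pre  = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4]
post = [3.9, 3.5, 4.1, 3.6, 3.8, 3.7, 3.2, 4.0]

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
result = stats.ttest_rel(post, pre)   # paired t-test on matched students

print(f"Mean engagement gain: {mean_gain:.2f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```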

Knowledge Retention

The primary objective of any learning program is to impart knowledge. Field-based student learning programs offer a unique opportunity for students to apply theoretical concepts in real-life situations, making it easier to retain the information. This hands-on approach allows students to connect with the material in a more memorable and applicable way.

For instance, marine biology students will develop a deeper understanding of the subject after participating in a guided field trip to the ocean to witness firsthand the biodiversity, adaptations, and interactions between different species – making it easier for them to make connections and retain new knowledge.

To evaluate knowledge retention, instructors can use pre- and post-program assessments or quizzes. These tests can be designed to assess specific learning objectives and compare students’ performance before and after attending the program.

Additionally, instructors can use concept maps or reflective writing exercises to assess how well students have integrated the newly acquired knowledge with their existing understanding of the subject. Some programs also include follow-up assessments a few months after the program to determine long-term retention.

Skill Development

Field-based student learning programs not only focus on imparting knowledge but also aim to develop practical skills in students. These skills range from basic scientific techniques to communication, teamwork, and problem-solving skills. Unlike traditional classroom settings, field-based learning allows students to apply these skills in real-world situations – giving them a chance to develop and refine them.

For example, a field trip to an archaeological site can help students learn practical skills such as excavation techniques, artifact identification, and data collection. At the same time, they also have the opportunity to work collaboratively with their peers and practice communicating their findings in a way that is accessible to a broader audience.

Evaluating skill development in field-based programs can include assessing students’ competency in specific techniques, analyzing their group dynamics and communication skills, and observing their problem-solving abilities.

Instructors can also gather feedback from students through surveys or self-reflection tasks to evaluate how they perceive their growth in these areas. This data can provide valuable insights into the program’s effectiveness in developing students’ practical skills that they’ll need for future academic or career pursuits.

If you need to evaluate the success of your field-based learning programs, Lumivero is ready to help. Contact us today to learn how we can support your program evaluation needs and improve the overall success of your field-based learning initiatives.

Request a Demo

“Lift the lid” on the black box of Monte Carlo simulation to see that there is not a lot there! Yet it is a very robust (hard to make mistakes) and powerful technique – and the only one for correctly calculating outcome probabilities for “complicated,” real-world, uncertain situations such as football tournaments. Using the 2024 EUROs, the webinar will demonstrate:

So even if you are not interested in football, you’ll still get something out of this webinar! And if you are interested in football, this will be the “reveal” of Steve’s 2024 EUROs probabilities.
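To show how little is inside that black box, here is a toy Monte Carlo sketch of a single knockout match, with each team’s goals drawn from a Poisson distribution and level matches settled by a 50-50 “shootout.” The expected-goal inputs are invented, and this is not Steve’s actual tournament model.

```python
import math
import random

def poisson(lam):
    """Draw one Poisson variate (Knuth's method; fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def p_team_a_wins(xg_a, xg_b, trials=100_000):
    """Estimate team A's chance of winning a knockout match."""
    wins = 0
    for _ in range(trials):
        goals_a, goals_b = poisson(xg_a), poisson(xg_b)
        if goals_a == goals_b:
            wins += random.random() < 0.5   # toy rule: penalties are a coin flip
        else:
            wins += goals_a > goals_b
    return wins / trials

# Invented strengths: team A expects 1.6 goals per match, team B 1.1.
print(f"P(team A wins) ~ {p_team_a_wins(1.6, 1.1):.3f}")
```

Repeating the same idea match by match over a full bracket is all a tournament simulation amounts to.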

The quest for excellence begins in academia and extends beyond the classroom and into practical experiences. Crafting robust field placement programs is paramount for department chairs, professors, and researchers in education. These initiatives not only distinguish institutions but also shape students’ futures.

In this blog post, we delve into the significance of field placement software in enhancing educational experiences, focusing on the insights gleaned from department chairs.

Why Field Placement Software Matters

The right field placement experience is critical for student development and education quality, especially in fields like teacher education and social work. But with so many moving pieces, from scheduling to finding the best experience to suit individual student needs, organizing these field-based opportunities can be complicated and time-consuming.

Field placement software makes it easy for administrators and students to organize field placement experiences and track progress over time. Having the software in place to build these experiences offers many benefits to educational institutions including:

Insights from Department Chairs: The Education Perspective

In the recent survey “Selection of Field Education Management Software in Social Work,” published by Field Educator, the consensus overwhelmingly favored the integration of field placement software into academic frameworks.

Key findings included:

The implementation of software for field placement has resulted in significant enhancements in student satisfaction and helped nurture a more supportive learning environment, fostering a sense of belonging among students and empowering them to take ownership of their academic and professional development.

Tevera: Elevating the Field Placement Experience

Student placement software such as Tevera has been observed to provide tangible benefits, including improved student outcomes and enhanced collaboration and knowledge sharing among faculty members.

Tevera empowers institutions to elevate their field placement programs and cultivate the next generation of industry-ready professionals by prioritizing student needs, facilitating effective communication, and providing data-driven insights. As department chairs continue to refine their educational offerings, embracing innovative technologies like Tevera is instrumental in shaping the future of experiential learning.

Below are some of Tevera’s key benefits:

Interested in learning more about how Tevera can enhance your institution’s field-based program? Request a demo or reach out today to discover its full potential.

We often discuss different qualitative data analysis methodologies on Between the Data such as different approaches to sifting and understanding the data that research generates. What we don’t often talk about are the practical, nuts-and-bolts aspects of coding in qualitative research – especially when coding with a team.

Lindsay Giesen is a Principal Research Associate at Westat in Rockville, Maryland. She has more than 15 years of policy research and program evaluation experience, and her work focuses on child nutrition and food security. On a research study conducted on behalf of the United States Department of Agriculture (USDA), she developed an approach to manage a qualitative data coding team that she believes improved the quality of the analysis and final report. Giesen joined Dr. Stacy Penna on Between the Data: Episode 19 to discuss her method for structuring, training, and managing qualitative data coders.

In this case study, we'll cover the highlights from their conversation, including Giesen's coding approach, analysis process, and use of the qualitative data analysis (QDA) software NVivo.

Research Project Background and Reasoning for a Guide to Coding Qualitative Research

Giesen and her team were tasked with conducting a study for the USDA’s Food and Nutrition Services. The goal was to understand how schools, school districts, and states gather data for federally funded child nutrition programs, including the National School Lunch Program and School Breakfast Program. This project involved site visits in four states during the spring of 2018. The team conducted 154 interviews with school cafeteria workers, school district officials, and state education officials.

The team had a six-week turnaround time to complete the qualitative coding of all interview data. Giesen and her team understood that it was essential to take a structured approach to training and monitoring their coders in order to meet their deadline. She learned three key lessons during this project:

A Multi-Layered Management Structure for Qualitative Data Coding Teams: Constant Communication, Better Collaboration

First, Giesen divided the work among the coders. Her team had conducted interviews in four different states, so each coder was assigned one state to work on. Giesen stresses that these coders weren’t coming to the transcripts cold.

“The coders were the more junior staff who had been the support staff on the site visits that we did,” Giesen explains. “So they knew the data, they knew the interviews and the respondents.”

Next, each coder was assigned to a senior reviewer. The reviewers had served as the leads for each site visit.

The senior reviewers reported to the lead analyst – that is, Giesen. An outside methodologist was also brought in to provide objective feedback on the team’s processes.

“We each had sort of our own support person, which was just so helpful,” Giesen explained. “It gives you someone to check your work and also someone to bounce ideas around with and questions, and it just created a really nice structure for us.”

This structure was collaborative, with constant communication among the different layers of the hierarchy. This flow of questions and feedback allowed issues to be flagged early in the process so that the coding scheme and other processes could be refined.

Coding Methods: Training in Stages for Higher-Quality Coding

Next, Giesen and the methodologist organized a training for the coders and reviewers.

“I'd worked on coding teams before . . . [where] I would get my instructions and the list of codes and then be left to my own devices," said Giesen.

Giesen didn’t want this to happen to her coders, and she also didn’t want to overwhelm them with information. She took a multi-step approach to training.

Step 1: In-Person Training for Coding Qualitative Data

First, her team met for a one-day, in-person training session that covered essential coding skills and how to work with NVivo qualitative analysis software. The first part of the day involved manual coding with paper and highlighters – line-by-line coding.

“There's a lot of noise in qualitative data,” Giesen said, “And you need to teach [coders] just conceptually how to sift through the noise to find what you're looking for.”


She had the coders work with practice transcripts, circling parts of a transcript and noting which codes from the code database she had created might apply. This progressed to working with Microsoft Word documents, highlighting specific sections of text to code and deciding when to apply more than one code. Using a predefined set of codes and assigning them to data is called deductive coding; the alternative, inductive coding, derives codes from the qualitative data itself.

“I find [NVivo] invaluable, and I think the team picked it up really quickly . . . we wanted to make sure that everybody had a shared understanding of how to use it.”

Step 2: Practice and Virtual Review of Coded Transcripts

After the one-day training session, Giesen moved the team on to coding the simplest, shortest transcripts – the school-level interviews. This would allow the coders to work with a manageable number of the 200 or so codes she had created for the project.

Giesen held a shorter remote training session in which she shared her screen and walked teams through which codes she expected would apply to the data. These training meetings also included a short practice session and time for questions. Coders and reviewers would leave these meetings tasked with choosing an assigned transcript to talk through with their reviewer.

Step 3: Q&A Session and Timeline Review

A few days later, the team re-convened virtually for a check-in meeting. Everyone brought questions from their assigned transcripts.

“It brought up places where the coders weren't sure which code to apply to certain pieces of text. And we would take that feedback and refine the coding scheme during that call if we needed to,” Giesen explained.

It also offered her team a reality check about how long each transcript would take to code, so that deadlines and timelines could be adjusted accordingly.


Train, Practice, Code

This train-practice-code process repeated itself throughout the six-week coding period, as the team moved from the school-level interviews to those conducted at the district and state levels.


Creating Detailed Reference Materials Showing Steps for Coding Qualitative Data

One of the challenges of coding in qualitative research is consistency among the coding team. Even experienced coders can struggle to assign codes accurately without adequate reference materials, and these inaccuracies can lead to problems when it comes time to analyze and dig deeper into the data through thematic analysis – identifying patterns and themes.

Giesen’s 2020 article in SAGE’s International Journal of Qualitative Methods, “Structuring a Team-Based Approach to Coding Qualitative Data,” describes the reference materials she created to help ensure consistent thematic analysis coding. These materials included a Microsoft Excel codebook that coders could filter depending on the type of transcript they were coding (e.g., only displaying codes for school-level interviews).

There was also a blank copy of the interview questions at each level with a list of the most applicable codes assigned to each question and a sample transcript with codes applied. The codebook was edited throughout the six-week coding period based on team feedback.
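A filterable codebook like that is straightforward to reproduce. Here is a minimal sketch using pandas in place of Excel’s filters; the codes and levels below are invented stand-ins, not Giesen’s actual codebook.

```python
import pandas as pd

# Toy codebook in the spirit of the Excel version: each code is tagged
# with the interview levels it applies to.
codebook = pd.DataFrame([
    {"code": "MealCounts",    "definition": "How meal counts are taken",    "levels": "school"},
    {"code": "Reimbursement", "definition": "Claim and reimbursement flow", "levels": "district,state"},
    {"code": "StaffTraining", "definition": "Staff training practices",     "levels": "school,district"},
])

# Mirror the coders' workflow: filter to the codes relevant to the
# transcript type at hand, e.g., school-level interviews.
school_codes = codebook[codebook["levels"].str.contains("school")]
print(school_codes[["code", "definition"]])
```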


Giesen confessed that she was initially anxious about building all these different steps into the team coding process given how tight her deadline was. However, it wound up resulting in a stronger final product.

“Giving people endless opportunities to ask questions and continuously revising the coding scheme to better fit the data through our team meetings . . . made the whole process go really smoothly,” Giesen said. “Best of all, it made our analysis and reporting process so much better because the quality of the coding was strong.”

Learn More About Giesen’s Approaches to Qualitative Coding

Interested in more detail about managing teams when coding qualitative research? You can read the paper Lindsay Giesen co-authored in 2020 online in SAGE’s International Journal of Qualitative Methods.

You can also learn more about this research by listening to the full podcast episode here.

Code Qualitative Data with an NVivo Free Trial

Organize your data and improve your process of coding qualitative data by requesting a free 14-day trial of NVivo today!

REQUEST A FREE TRIAL OF NVIVO

Researchers used @RISK and PrecisionTree to model the likelihood of a successful evacuation during a volcano eruption.

University of Bristol’s Environmental Risk Research Centre (BRISK) adds new dimension to modeling volcanic risk in Guatemala.

Conducting a quantitative risk assessment is often a difficult process, requiring data that is sparse or even unobtainable. With volcanoes, the effects of uncertainty are accentuated by the potentially high costs of making a wrong call.

Guatemala has many active volcanoes, but none are as close to large populations as Volcán de Fuego, potentially one of the most dangerous volcanoes in Central America. Many farmers live and work in its shadow because its fertile slopes provide the best ground for coffee growing in the region. Large eruptions in 1974 fortunately did not lead to any deaths, but buried in the volcano’s geological history are signs of ominous behavior.

Using Monte Carlo sampling to quantify the threat

The volcano has been very active over the last few years, with many small eruptions taking place every day, and the fear that this activity could signal a build-up toward larger eruptions in the future is a worrying prospect. The Instituto Nacional de Sismologia, Vulcanologia, Meteorologia e Hidrologia (INSIVUMEH) regularly monitors activity at the volcano. However, despite the gallant efforts of the scientists there, no formalized risk assessments are carried out, mostly due to a lack of funding and resources.

Recent work using Lumivero’s DecisionTools Suite (previously a Palisade product), however, is now enabling volcanologists to quantify one of the threats the volcano poses to people’s lives. An integrated set of programs for risk analysis and decision making under uncertainty, the DecisionTools Suite runs in Microsoft Excel and puts Monte Carlo simulation and other advanced analytics on the desktop, quickly and simply.

"DecisionTools Suite has proved to be invaluable in the work we are doing with INSIVUMEH, and potentially very useful for those living and working around Volcán de Fuego."DJonathan Stone
Unversity of Bristol

A different approach to risk assessment

Conventional risk assessments attempt to model the probability of a hazard and combine that with the vulnerability of the population, to create societal risk curves and estimated values of Individual Risk per Annum (IRPA). For many of the people living on the slopes and indeed the authorities, knowing the potential number of deaths or cost from an eruption is not entirely useful, as little planning control or mitigation can be carried out. In an attempt to increase the usefulness of the risk modeling to the end-user (the authorities and people living near the volcano), BRISK has looked at the vulnerability in a different way.

Normally, volcanic risk assessments assume that the whole population is present in a location when a hazard hits. New work by BRISK, however, has modeled the likelihood of a successful evacuation, using both @RISK and PrecisionTree, by inputting several variables obtained through a process of structured expert judgment. These variables, which include the time between a possible eruption and a possible hazard hitting a location, along with communication times from authorities and evacuation times, are each estimated by the experts with an uncertainty distribution. These expert views are then weighted and pooled together. The variables are combined in a logic tree within PrecisionTree, with each end node being either evacuation or no evacuation, and the probability of these outcomes quantified along with their uncertainties. When fed back into the @RISK (Hazard * Vulnerability) model, the effects of a potential evacuation on the risk are very clear.
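In spirit, the calculation reduces to a race against the clock: an evacuation succeeds when warning plus evacuation time beats the hazard’s arrival. The sketch below illustrates this with invented lognormal parameters; the real study pooled weighted expert judgments for these distributions and ran the logic tree in PrecisionTree.

```python
import random

def p_successful_evacuation(trials=100_000):
    """Toy Monte Carlo version of the evacuation logic tree.

    All distribution parameters (in minutes) are invented for
    illustration, not taken from the BRISK study.
    """
    successes = 0
    for _ in range(trials):
        hazard_arrival = random.lognormvariate(3.0, 0.5)   # eruption -> impact
        warning_delay  = random.lognormvariate(2.0, 0.6)   # authorities raise alarm
        evacuation     = random.lognormvariate(2.5, 0.4)   # time to clear the area
        if warning_delay + evacuation < hazard_arrival:
            successes += 1
    return successes / trials

p_evac = p_successful_evacuation()
# Risk to those present then scales with the chance evacuation fails:
#   risk ~ P(hazard) * vulnerability * (1 - P(evacuation))
print(f"P(successful evacuation) ~ {p_evac:.2f}")
```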

Better planning and effective mitigation strategies

When looking in more detail at the model outputs from the logic tree, it became clear where the sensitivities were within the system. For example, it may be for a given location that the amount of time between a warning and the hazard hitting is crucial, or it may be that the time taken to evacuate is crucial. This new way of modeling volcanic risk informs better planning and more effective mitigation strategies.

Jonathan Stone, a researcher at the University of Bristol, working with colleagues Prof Willy Aspinall and Dr Matt Watson, said “DecisionTools Suite has proved to be invaluable in the work we are doing with INSIVUMEH, and potentially very useful for those living and working around Volcán de Fuego.”

Professor Willy Aspinall has been using @RISK software for some time in his work analyzing the risk of volcanic eruptions and earthquakes around the globe.

Originally published: Dec. 5, 2020
Updated: June 7, 2024
