Perspective: How Covid-19 Impacted Monitoring and Evaluation—in 2 Examples
Thursday, July 30, 2020
Image credit: Unsplash.
“Social distancing”, or rather “physical distancing”, as many (myself included) prefer to call it, has become the new normal in this Covid-19 world. The virus’s ability to spread so easily has also left many unable to travel within and across country borders. As a monitoring and evaluation (M&E) researcher who develops and implements research designs and data collection plans to measure the socio-economic impacts of business operations and development programs, I’ve needed to adjust my work to these new realities.
Covid-19 has revealed many existing inequities across race, gender and socio-economic class. Given this reality, it’s clear that entire M&E plans should not only be reviewed but broadly adapted. As difficult as these times are, they nevertheless represent an opportunity to pursue new research questions and gather knowledge on how different groups of people are being affected by the pandemic. Below are two recent examples of how my team and I have adapted evaluation efforts in programs in the health and tertiary education sectors.
Ask the right questions
Before I dive into the specific programs and the adaptations we made, I want to briefly share the process, to give a sense of the effort needed to design these changes. I began by developing contextualized questions for each program, structured around the framework we use to develop M&E plans. These questions were also informed by a webinar organized by the Centre of Excellence for Development Impact and Learning and by articles from the World Bank and the Aspen Network of Development Entrepreneurs. Then, using these questions, I facilitated a one-hour brainstorming exercise with the larger research teams on both programs.
Healthcare: This program aims to strengthen clinical laboratories in Africa through key activities, including on-site mentoring provided by our corporate partner. We started our brainstorming discussion with questions about the focus of the study, i.e., the evaluation questions. These center on assessing the impact of the activities our corporate partner provides to the participating labs. We will continue with this focus, and will also explore how Covid-19 is affecting workflow and quality management systems at the labs. Critically, we will also examine how these activities have helped the labs with their pandemic preparedness, to support continued funding of this program.
We next discussed what adaptations we should consider to the research design to capture potential changes in operations due to Covid-19. For example, our local partner notes that in one location, lab staff are dealing with reduced patient volumes as patients avoid hospitals. How, then, should we measure the impact of this change in patient behavior on the labs? With this in mind, our response includes conducting one-time, 30-minute interviews with lab managers to answer these questions. These focused interviews will also help us understand the labs’ capabilities under stress and the sustainability of program impacts. If our partner replaces the currently paused on-site mentoring with virtual trainings, we may revise current metrics and the adjoining data collection methods to measure impacts.
In this project, we collect the majority of data remotely; therefore, we do not need to switch from in-person to remote data collection. (Learn more about remote data collection here; additional resources can be found here.) We do have one dataset that must be collected on-site, after completion of all program activities at the lab. Typically, our local partner collects this on our behalf. If corporate travel (within country and region) is still not permitted when the time comes to collect this data, our contingency plan is a telephone call with the lab manager. In that event, we will remove questions from the data collection protocol that cannot easily be asked over the phone, with the added advantage of shortening the call.
Education: The second project involves an ongoing virtual exchange course offered at the University of Michigan in partnership with universities and institutions in Egypt, Lebanon and Libya. The course focuses on international business and developing cross-cultural skills. In our brainstorming exercise, we revisited the evaluation questions, the research design and the data collection method, and found that no adaptations were necessary to any of these components of the M&E plan. Our evaluation questions focus on understanding changes in students’ knowledge of other cultures, their empathy, their perspective-taking of persons from other cultures, and their cross-cultural communication, among other constructs. Measurement of these constructs remains essential in a global pandemic. In this quasi-experimental study, we have a comparison group of students across the four countries who, as undergraduate students, are representative of enrolled students (the treatment group). These students are recruited by asking enrolled students to invite a friend to join the comparison group, using an incentive (a $5 to $10 gift card). As a result, we did not see the need to change our comparison group recruitment plan. And finally, because all data is collected through electronic surveys, we did not need to change the data collection method.
The course, first offered in Winter/Spring 2020, transitioned from an in-person to a fully remote class when our partner institutions closed their campuses in March 2020. In this fully remote model, beginning about seven weeks into the course, every student connected virtually via the video conference platform BlueJeans. Prior to that, students attended lectures in class, and all four classrooms were connected synchronously via BlueJeans. In the upcoming second semester, the class will be delivered fully online, and here’s hoping that students will be back in the classroom as soon as Winter 2021. Given that the mode of instruction will differ across semesters, we considered whether it is appropriate to compare and analyze data across the four semesters over which this course will be offered. We have decided to carefully document the implementation in each semester and share these caveats when reporting on the impact of the course across all semesters.
We also pondered the ethics of asking already stressed-out students to fill out surveys while they juggle coursework and technology constraints at home, as well as the quality of the data they would share under such conditions. We therefore cut unnecessary questions and added a question about how Covid-19 has affected students’ ability to learn the material, pay attention during virtual learning at home, and complete assignments in times of stress. The goal is to use this data to improve instruction delivery and the assignments, addressing the concerns students raise. This highlights that monitoring and evaluation is not just for accountability and reporting; it can also help adapt a program in real time, in changing environments (e.g., a global pandemic), so that the program can still meet its goals.