Continuous quality improvement (CQI) has been in common use as a business model since the 1920s.Reference Krishnan 7 It is now an emerging strategy in the field of humanitarian response, enlarging the toolkit of first responders in disaster management and response. CQI is particularly well-suited to community-based programming in disasters because community engagement at scale is essential for implementing evidence-based solutions. We use our experience with the Ebola crisis in Sierra Leone to demonstrate the value and impact of CQI. As we will show in this case study, CQI utilizes real-time feedback loops to provide data for decision making at the front line, not just at command central. CQI facilitates the incorporation of local actors as well as an understanding of local perceptions of how disease processes work. In so doing, it addresses relevant cultural practices and supports an informed messaging strategy.
The 2014 Ebola epidemic in Sierra Leone began with rapid spread of the virus as a result of multiple interacting factors.Reference Arab-Zozani and Ghoddoosi-Nejad 1 These included slow recognition of the dangers Ebola posed, a lack of information at the household level, and an overall weak health system. These factors compounded to delay information from reaching people in positions of power and hampered the coordination of a large-scale response to stop the disease.Reference Arab-Zozani and Ghoddoosi-Nejad 1 As the epidemic unfolded, the behavior of Sierra Leoneans was increasingly motivated by fear, fueled by the lack of consistent messaging from all levels of government. This, in turn, caused the national government to become increasingly restrictive, halting border crossings as well as local travel. These restrictions had the unintended effect of spiraling fear of the disease without driving any effective messaging on prevention. Because many of the early deaths from Ebola were health care providers, fear was pervasive and extended to health care workers themselves. Community health clinics were abandoned out of fear of the virus.Reference Arab-Zozani and Ghoddoosi-Nejad 1 By October of 2014, approximately 5 months after the first Ebola cases were identified, Sierra Leone experienced a simultaneous, fear-driven collapse of the health system and of local and national governance structures. According to several documented sources, the government significantly delayed declaring a national emergency.Reference Harris 16
The Sierra Leone population was no longer sure whom to trust or where to turn, generating fear-based responses at the population level. One example was the disregard of circulating information bearing the logo of the Ministry of Health, even when it was produced by disaster response organizations.Reference Bennett, Agyepong and Sheikh 3 Myths about the “strange disease” abounded. Prevention measures, such as restrictions on washing and touching dead family members, conflicted with existing cultural practices and were, thus, not trusted. In Port Loko, an urban center in the west of Sierra Leone, there was widespread lack of confidence in the emergency health care systems set up by the local government, which included foreign military and international non-governmental organizations (INGOs) that had arrived to assist.Reference Arab-Zozani and Ghoddoosi-Nejad 1 Despite their best efforts, by early 2015 the district was coping with increasing infection rates and deaths.
As demonstrated by our case study, outlined below, disasters are often dynamic in nature, even when they result from a single event. They require programming that is “responsive” to the ever-evolving situation.Reference Callaway, Yim and Stack 2 CQI allows models to be flexible and adaptive, with a Plan-Do-Study-Act (PDSA) loop applied to each of its interventions.Reference Bennett, Agyepong and Sheikh 3 Real-time, field-level data drive such dynamic feedback loops. Models of disaster response often assume the underlying state of things to be static and approach both planning and subsequent implementation in a linear way.Reference Christoff 4 Such programs might plan and do but fail to study the effectiveness of their interventions further and act upon the new reality. “Unresponsive” implementation strategies increase problems instead of resolving them.Reference Feser 5
Emergency response actors often arrive with a pre-existing agenda aligned with their own experiences or institutional directives, as stated by Vasovic.Reference Harris, McGregor and Perencevich 6 Importantly, they expect a central coordinating agency to work hand in hand with national and local governance systems. These central coordinating agencies rely heavily on local data input, which is usually available only at the national level. Many therefore struggle because community-level data are lacking in detail or unavailable. Feedback loops allow community- and household-level data to be appreciated. A good example of the application of feedback loops in public health was presented in 1994 by Rissel.Reference Means, Wagner and Kern 8 Rissel specifically discussed the role of “community empowerment” as a process that centers on a sense of community and that results, by means of feedback loops, in community members obtaining control over their own resources and eventually gaining autonomy in the emergency response process. Rissel also identified the critical importance of access to evidence-based medical information in the case of an infectious disease-induced disaster. There are many examples of emergency response actors developing information messages with limited indigenous contributions, distrusting culturally driven local communication networks (e.g., word of mouth, vernacular radio programs, or community meetings), and working within “unadapted” timeframes.Reference Paek, Niess and Padilla 9 Messages with insufficient cultural sensitivity may be technically correct but misunderstood, rendering no benefit at the household level, where disease is transmitted and behavioral decisions are made. This communication problem further contributes to a general lack of trust encountered in the communities.
In developing this paper, we also undertook a structured rapid review of the current literature using the following keywords: CQI, quasi-experimental study design, plan-do-study-act cycles, emergency response, rapid response, community-based programs, and emergency response evaluation/qualitative/monitoring tools. There is significant research published about gathering qualitative data during an emergency response, but very little has been written about how to analyze that data and use it to improve programming. We discovered very few articles in peer-reviewed journals that addressed completing the PDSA cycles and incorporating CQI into program design in the emergency response setting. A case study published by UNHCR in Skopje came closest to supporting our findings and conclusions, but clearly more research needs to be done in varied emergency response settings. 13
Methodology and Approach to Data Collection
Based on our success with CQI in the humanitarian context, we felt compelled to share how we utilized this tool, which is mostly associated with business models and health care systems. In this paper, we describe a dynamic CQI model using continuous, real-time data feedback loops. Feedback was sought from all stakeholders. The usual disaster response approach is to include feedback at the national level and/or the local aggregate level; the CQI described here differs. Engagement of the front-line local community was required to gain a true field perspective. We needed to ask why in order to understand the context: the responding team from Partners in Health (PIH) had to ask specifically why certain behaviors were being chosen or avoided by the local population. Each data-collecting member of the response team, which included 10 Sierra Leoneans and 1 consultant from Partners in Health, was charged with continuously seeking answers to 3 questions in their interactions with community members, government officials, other responders, and health care providers: What is working? What isn’t working? How can we do better? These questions were asked and answered using the Socratic method; qualitative data were then noted and shared during planned weekly meetings with direct supervisors. Qualitative data were received by the 11 office staff mentioned above and written into a shared report, providing critically comprehensive feedback to decision makers at local parish, district, and national levels. This resulted in continuous adjustments to programming at all levels of implementation. The listening and learning posture of CQI engenders trust and thus improved compliance with containment messages. It also resulted in improved coordination between the different arms/actors of the external response team, as well as improved population-level outcomes.Reference Feser 5
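The weekly aggregation described above, in which each collector's qualitative answers to the 3 standing questions are compiled into one shared report, can be sketched as a simple grouping step. This is a minimal illustrative sketch only, assuming a hypothetical note format (`collector`, `question`, `observation`); it is not part of the original program.

```python
from collections import defaultdict

# The 3 standing questions each data collector asked continuously.
QUESTIONS = ("What is working?", "What isn't working?", "How can we do better?")

def aggregate_weekly_reports(field_notes):
    """Group each team member's qualitative notes under the three
    standing questions so supervisors can compile one shared report."""
    report = defaultdict(list)
    for note in field_notes:
        question = note["question"]
        if question not in QUESTIONS:
            continue  # only the three standing questions feed the report
        report[question].append((note["collector"], note["observation"]))
    return dict(report)

# Hypothetical example notes from two collectors in one week.
notes = [
    {"collector": "CHW-01", "question": "What is working?",
     "observation": "Households accept referral slips from known CHWs."},
    {"collector": "CHW-02", "question": "What isn't working?",
     "observation": "Prevention posters conflict with burial practices."},
]
weekly = aggregate_weekly_reports(notes)
```

In the field this compilation was done by people in weekly meetings, not software; the sketch only makes the information flow concrete.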
Case Study: Feedback Loops as Part of CQI in the 2015 Sierra Leone Disaster Response
Partners in Health, in cooperation with the Sierra Leone government, developed a responsive community health worker (CHW) network in the face of a collapsed health system, both to support the emergency medical response and to extend the emergency health system into the impacted communities. The government of Sierra Leone had adopted a community health worker framework well before the Ebola epidemic and was, therefore, familiar with the benefits of community-based interventions. The geographic areas covered by this program included Lokosama, Port Loko, and Kaffu Bullum, all chiefdoms within the District of Port Loko in Sierra Leone, with a population of roughly 260,000. A 4-arm program model was developed after an initial rapid qualitative assessment identified gaps in the disaster response.Reference Qiu and Chu 10 The gaps identified are listed in Table 1. Rapid feedback loops were incorporated into the field program design to address these gaps; examples of gaps identified via feedback loops, and how they were addressed, can be seen in Table 2. The feedback loops included weekly meetings with direct reports and direct supervisors, moving this feedback rapidly to the decision makers for the response.
The CQI program integrated information from many different sources (program staff, other NGOs, government partners, the British military, community leaders, and community members) on a regular basis. Feedback loops were utilized at all program levels, according to plan-do-study-act cycles and informed by real-time field data (Nally et al 2021). Clear lines of communication were delineated at each level of the program management structure (who reports to whom, and where information should flow to and from) so that meaningful data for decision making did not get lost. At each level, leaders were identified and given the important responsibility of continuously gathering and sharing the information needed (Figure 1). Too often, data are gathered via feedback loops but largely remain ignored.Reference Serenko and Bontis 12 The key to feedback loops is their cyclic nature (plan-do-study-act), in which data lead to the identification of potential gaps and to adjusted decision making that improves program delivery.
Figure 1 lays out the feedback loops occurring at each level of the management structure during the Ebola response in Sierra Leone. The weekly gathering of information, and the feeding of it both “up and down” the management structure, was integral to the design of the program. This model can be adapted and used in many settings by emergency response programs. Such models require leaders at each level to accept responsibility for gathering and sharing information continuously. Through this constant cycling of information, program activities can be adjusted immediately to accommodate the evolving disaster or to close gaps in implementation.
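The cyclic plan-do-study-act structure underlying these feedback loops can be sketched abstractly: each round's study step feeds the next round's plan. The sketch below is a hedged illustration only, with invented callables and a toy numeric "coverage gap"; it is not the program's actual logic.

```python
# Minimal sketch of repeated PDSA cycles. The four callables (plan, do,
# study, act) and the coverage/gap numbers are hypothetical illustrations.

def pdsa_cycle(plan, do, study, act, rounds=3):
    """Run repeated PDSA iterations: each round's study step feeds the
    next round's act step, mirroring continuous feedback loops."""
    state = plan()                 # Plan: set the intervention and its target
    history = []
    for _ in range(rounds):
        outcome = do(state)        # Do: implement at the field level
        gap = study(outcome)       # Study: compare field data against the goal
        history.append(gap)
        state = act(state, gap)    # Act: adjust the program before next round
    return state, history

# Toy usage: a program closes half of the remaining "coverage gap" each cycle.
plan = lambda: {"coverage": 0.5}
do = lambda s: s["coverage"]
study = lambda outcome: 1.0 - outcome
act = lambda s, gap: {"coverage": min(1.0, s["coverage"] + 0.5 * gap)}

state, history = pdsa_cycle(plan, do, study, act)
# history shrinks each round: the loop is converging on the target.
```

The point of the sketch is the shape of the loop, not the arithmetic: a cycle that stops after "do" (as many responses did) never updates `state`, which is exactly the incomplete PDSA pattern the case study describes.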
The Feedback Loops and PDSA Cycle in Practice
The key to this process is utilizing the information gathered to inform decision making for the program. There are countless instances where information gathered in this way is then ignored due to various biases.Reference Serenko and Bontis 12 By incorporating the feedback, a program can build trust in itself and its implementers, respond more effectively to the changing nature of a disaster, and ensure that resources are used to greatest impact. Below are some examples from this program highlighting information the feedback loops provided that was acted upon to improve program delivery and impact. In most cases, other organizations or people had begun the PDSA cycle but not completed it, thus stalling or slowing the response and impeding its adaptability to the evolving context. After these highlighted cycles were implemented, the feedback loops continued to validate the implementation and disaster response or to highlight gaps in them, meaning each of the 4 arms of programming could be adjusted.
Discussion
In the above case study, we demonstrate how a structured plan-do-study-act approach can rapidly highlight issues related to the implementation of emergency response programs. Once information was fed back to emergency responders, they acted to rectify these gaps in coordination and implementation.Reference Qiu and Chu 10 Together with the Sierra Leonean government, they developed and integrated community-based responses as part of broader CQI feedback loops.Reference Hoffman 14
Feedback loops creating data for decision making are part of CQI, but where and how this information is gained is often ignored, or its importance diminished, at the national coordination and implementation level.Reference Rissel 15 Many disaster response models do not build in community-based feedback loops. Medical data on case finding are gathered and pushed out, but community-level implementation data received from the community actors doing the case finding are not considered relevant to implementation and impact. Therefore, such data are not analyzed or used to inform programmatic decisions. Household-level decision makers need data they can trust. Too often in disaster response, education is viewed as a tertiary program rather than as integral to reducing fear-based responses (FBRs) and integrating the community. It can be easier, in some cases, to rely on fear as a motivator for compliance. Two examples of this are highlighted in our case study. These challenges were remedied once frontline information was fed back to the decision makers at the national coordination level and at the household level.Reference Arab-Zozani and Ghoddoosi-Nejad 1
UNHCR published an interesting case study highlighting the importance of feedback loops in building trust in emergency response and disaster settings. They propose a structure similar to ours and highlight their own success in completing the PDSA cycles. However, there were very few other field-tested examples, and none that focused on the importance of closing these PDSA cycles completely and running them continuously throughout the implementation period. The importance of feedback loops has been highlighted anecdotally most recently during the global coronavirus pandemic. Many governments and responding bodies have struggled with messaging and with securing the wider population’s compliance with restrictions.Reference Arab-Zozani and Ghoddoosi-Nejad 1 This has necessitated the use of feedback loops, the CQI process, and PDSA cycles, whether formally or not,Reference Qiu and Chu 10 thus bringing their importance to the forefront of our current global public health climate.
The tendency of disaster responders to arrive with a prepackaged or preconceived idea of how the response should proceed ignores the impact and importance of indigenous systems and beliefs, to the detriment of the health and lives at stake.Reference Feser 5 CQI is not often thought of as a method for responding to evolving disasters, yet the science of QI and the ability to use real-time data for decision making are hallmarks of both good disaster response and good QI processes. For this program, community members were directly involved in the PDSA cycles.Reference Feser 5 CQI uses many small feedback loops to test both process and outcome measures.
While, in practice, feedback loops and quick, responsive program adaptations do increase trust and impact in disaster response, measuring that impact over time becomes challenging. Traditional models of assessment are difficult to apply when parts of a program or implementation plan are constantly evolving. One can have a data point to start with and a clear idea of the intended endpoint, but many models of quantitative and qualitative research require periodic measurements of the same data point, which becomes difficult if the program evolves and that data point is no longer relevant to the outcome hoped for at the start of the process.
The case study of Ebola in 2015 provides an excellent example of an evolving epidemic in which real-time data feedback loops allowed implementation adjustments to programming that reflected, and had an impact on, the situation as it evolved. In this paper, we demonstrated how simple feedback loops produced data guiding our response adaptation to help “keep up” with the ever-evolving epidemic and community needs. Further field testing is necessary to understand how traditional measures of success can still be applied to disaster implementation.