Community Contribution

What is a "Learning Review": The Experience of SDRM-SI and USAID/Ethiopia

May 06, 2022
Chelsie Kuhn

What?

From March to December 2021, the Strengthening Disaster Risk Management Systems and Institutions (SDRM-SI) Developmental Evaluation (DE), implemented by Headlight Consulting Services, LLC, conducted a Learning Review. The DE team took a Learning Review approach because, unlike a desk review or literature review, a Learning Review systematically triangulates and substantiates findings with a higher degree of scrutiny for validation and sharing, rather than merely summarizing the evidence. This level of scrutiny and nuance allows for highly reliable findings from the data and more targeted recommendations that move stakeholders to action. The Learning Review involved coding and analysis of 216 documents from USAID/Ethiopia and leaned extensively on Headlight’s local Ethiopian DE Evaluator team of four staff for a majority of the process (80 percent of the total level of effort). This example of localized evidence generation and synthesis demonstrates both the value and cost-effectiveness of creating space for local capacity building opportunities: the approach ensured that the final report had appropriately contextualized conclusions that could be ground-truthed and framed for uptake by stakeholders ranging from USAID to international implementers to local government officials to local NGOs.

How?

The SDRM-SI DE team solicited all DRM-related documents from our colleagues at the Mission to build an Evidence Catalog of our data and to organize how we would pursue the effort as a team. We then moved into primary coding of the documents, tagging all excerpts relevant to the codebook we designed based on the DE’s learning questions. Once all documents were coded, we moved into secondary analysis to reach the next level of use-focused nuance across the coded excerpts under each code; this generally involved consolidating all excerpts under one code, then organically coding them to sort out sub-trends and distinctions (e.g., under the “What Works>Capacity Building” code, a sub-trend we might find in secondary analysis could be “Successful Training Sequencing”). We then moved our work into the Findings, Conclusions, and Recommendations Matrix to ensure that the recommendations were informed directly by the findings, so stakeholders could best use the information to identify and plan for adaptations. For a step-by-step on how Headlight conducts Learning Reviews, see our Learning Reviews: Using What You Already Know blog post.
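
To make the primary-coding and secondary-analysis flow more concrete, here is a minimal sketch in Python. It is illustrative only: the document names, codes, and excerpts are entirely hypothetical, and the DE team’s actual analysis was carried out by analysts working through the Evidence Catalog, not by a script. The sketch simply shows the logic of consolidating all excerpts tagged under one code so that sub-trends can then be identified.

```python
# Illustrative sketch of the primary-coding -> secondary-analysis flow.
# All names below are hypothetical; this is not the SDRM-SI team's tooling.

from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Excerpt:
    document: str  # source document from the Evidence Catalog
    code: str      # primary code from the learning-question codebook
    text: str      # the tagged passage


# Primary coding: excerpts tagged against the codebook (toy examples)
excerpts = [
    Excerpt("IP quarterly report", "What Works>Capacity Building",
            "Training delivered after staff turnover had to be repeated."),
    Excerpt("Midterm evaluation", "What Works>Capacity Building",
            "Sequencing basic then advanced modules improved retention."),
    Excerpt("IP annual report", "Challenges>Coordination",
            "Overlapping assessments were conducted by two implementers."),
]

# Secondary analysis: consolidate all excerpts under each code so they can be
# read together and sorted into sub-trends (e.g., "Successful Training Sequencing").
by_code = defaultdict(list)
for ex in excerpts:
    by_code[ex.code].append(ex)

for code, group in by_code.items():
    print(f"{code} ({len(group)} excerpts)")
    for ex in group:
        print(f"  - [{ex.document}] {ex.text}")
```

In practice this consolidation step is what makes the sub-trends visible: once every excerpt under a code sits side by side, analysts can organically code the group again and feed the resulting patterns into the Findings, Conclusions, and Recommendations Matrix.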

Why?

At Headlight, we feel it is our responsibility to use international development funding wisely, and one way we can keep moving the field forward with this in mind is to understand and analyze what information already exists. In this vein, the SDRM-SI DE team wanted to start the DE by understanding what had already been done in DRM implementation and what evidence existed, so that the data could be used to streamline the design of targeted learning questions and evaluative efforts and avoid duplicating data collection. The ability to understand what has already been done through a Learning Review:

  • saves more resources for evaluative efforts;
  • deduplicates evidence generation efforts across implementers;
  • builds stronger relationships with local implementers through early familiarization with work completed to date;
  • creates intentional space for teams to get up to speed; and
  • improves understanding of potential opportunities to support implementers’ adaptive actions based on past challenges and patterns.

Outputs?

As the first Project-level DE at the Agency, our team is working to ensure that our information reaches varied audiences, including USAID/Ethiopia Mission leadership, SDRM-SI activity designers, implementing partners (IPs), USAID/Washington Operating Units and bureaus (e.g., the Bureau for Humanitarian Assistance (BHA)), broader DRM stakeholders within Ethiopia, and others making data-driven decisions. Serving such a wide variety of stakeholders has pushed us to develop different deliverables in addition to a full report documenting all of our findings, conclusions, and recommendations in one place. For example, our team hosted a Strategic Learning Debrief workshop to help IPs digest the findings, contextualize those lessons learned to their activities, and begin to identify potential adaptations. Separately, our team has pulled together a number of shorter briefs, memos, slide decks, and visual summaries to support other major decision points, including activity design research and iteration workshops, co-creation workshops, and disaster preparedness planning. Adapting the presentation of the information into different kinds of contextualized deliverables has increased the chances that the findings are processed, integrated, and used to map the available evidence onto strategies and decisions for evidence-driven development interventions.

Emergent Outcomes?

While it is still too early to substantiate outcomes and the Learning Review’s contribution, the SDRM-SI DE team is tracking how the Learning Review has:

  • informed evidence-driven theory of change development for new concepts and activity design;
  • contributed to conversations around knowledge management and evidence synthesis to support improved use across the Mission; and
  • supported strategic funding conversations to ensure investments are targeted to the most pressing needs and effective implementation strategies.

Now What?

DE Points of Contact embedded in each relevant Activity will continue working with the Mission and IPs to further plan and execute the adaptations identified in the Learning Review’s Strategic Learning Debrief. Simultaneously, now that the SDRM-SI DE has completed the Learning Review and mapped the findings to the applicable DE Learning Questions, the DE will begin to pursue prioritized evaluative efforts, including an institutional architecture assessment of the DRM policy system, climate shock evidence building to support La Niña response, and efforts to explore crisis modifier uses and outcomes. After this round of evaluative efforts has concluded, the DE will synthesize evidence to assess which DE learning questions have been answered and which remain open. We will then run cyclical evaluative efforts until prioritized questions are answered, and work to identify new questions to meet emergent, targeted, and successive actionable evidence needs through the end of the period of performance.

Questions and Follow-Up Contact Information?

For any further questions about this Learning Review or the related SDRM-SI DE, please contact the DE Team Lead, Dr. Yitbarek Woldetensay at [email protected], the DE Administrator Lead, Chelsie Kuhn at [email protected], or the Senior DE Administrator, Rebecca Herrington at [email protected].

About the authors
Chelsie Kuhn

Chelsie Kuhn is a CLAME Specialist at Headlight Consulting Services, LLC and is the DE Administrator Lead for the Strengthening Disaster Risk Management Systems and Institutions (SDRM-SI) Developmental Evaluation (DE) at the USAID/Ethiopia Mission. With six years of dedicated professional experience, she brings a strong background in supporting efficient, effective, and evidence-based operations with NGOs and in academia, experience managing large groups of diverse stakeholders, and a data-driven mindset to project management.