Learning About Learning

Jun 7, 2018 by Rebecca Flueckiger Comments (0)
COMMUNITY CONTRIBUTION

Rebecca Flueckiger is a MERLA Operations Research Specialist with RTI International.

RTI’s Monitoring, Evaluation, Research, Learning and Adapting (MERLA) Community of Practice hosted a panel of distinguished guests to discuss USAID's Collaborating, Learning, and Adapting (CLA) approach at an event titled “From learning to adapting: How do we get to learning, and where do we go from there?”  Participants included Heidi Reynolds (MEASURE Evaluation), Easter Dasmariñas (USAID/Philippines, LuzonHealth Project), Tara Sullivan (K4Health), and Stacey Young (USAID/PPL). The event brought together an audience of nearly 250 individuals representing over 40 organizations (both in-person and online) who contributed to a rich and engaging discussion on CLA. If you were unable to attend or wish to review the discussion, the full recording can be found here.    

Here are a few of our top takeaways from the event:  

BUILD IT INTO WHAT YOU’RE ALREADY DOING

It is essential to build CLA into the intervention in a thoughtful and rigorous manner at the beginning of implementation to achieve better development outcomes. When putting together M&E plans, program implementation plans, and work plans, practitioners should infuse them with learning and build a learning agenda that links all elements together. Heidi Reynolds described the challenge of “building the ship as we sail” and advised us not to start from scratch but to build on the work of those who came before us, and to think forward about how others can build upon our work. Stacey Young’s words particularly resonated: “CLA is not a hobby you do on the weekends; it has to be an integral part of day-to-day programming.”  
 
MOVE AWAY FROM LINEAR APPROACHES

We must shift away from traditional linear approaches toward adaptive management and systems approaches and remember that CLA is informed by AND informs adaptive management. Easter Dasmariñas explained how the USAID-supported LuzonHealth team became champions of CLA. While gathering and learning from their data, the LuzonHealth team realized that they were not doing intentional learning and adapting. They then changed gears and took a holistic systemic approach (MERLA). Through the application of MERLA, they augmented existing programmatic M&E with operations research and learning best practices and approaches. They further incorporated USAID’s CLA approaches and tools to ensure that they were not only synthesizing program learning, but also using it to inform programmatic adaptations, policy decisions, and communications and dissemination, both internally and externally. Easter explained: “Adaptive management requires openness and flexibility with local stakeholders. Having an open venue for exchange can help address issues of underperformance.”  

MEASURE YOUR WORK AND LINK IT TO VALUE

Rigorous measurement of program implementation is key to providing data for programmatic decision making. Equally important is rigorous measurement of how CLA approaches and tools lead to enhanced program implementation and adaptive management. We need good knowledge management practices, tools, and metrics to enable us to quantify and qualify learning, information, and evidence related to program implementation and CLA. However, many implementers struggle with how to incorporate knowledge management best practices into their work. Tara Sullivan shared tools, developed by K4Health, to make this measurement easier and more standardized. Tara started off her discussion with the following perspective on knowledge management and how it links to CLA: "Knowledge management is really that enabler that we need for our learning and adaptation work."  

KEEP THE FOCUS ON THE "C" IN CLA
 
While it is easy to take the C in CLA for granted, it is important to recognize the power and value of Collaborating to ensure that Learning can proceed to Adapting. Stacey Young advised us to start small and pilot our work, identify and engage with natural champions, start with low-hanging opportunities, build on early wins, encourage and enable partners to own wins, link with leadership to highlight the value of CLA, and be inclusive. We can have the best systems, research, and learning, but learning may not turn into adapting if we do not also have collaboration as the glue to ensure that stakeholders own the learning and are part of the decision making.   
 
Through the discussions and Q&A many other points emerged that deserve further exploration, including streamlining learning deliverables, addressing the complexity of learning agendas, implementing user-friendly systems and adaptive learning loops, and how to build and foster learning in the field. Stay tuned for more on this from the RTI MERLA Community of Practice (MERLA@rti.org), #RTILearns.


Ways to Integrate a CLA Approach in the Performance Management of a Project/Activity

May 18, 2018 by Motasim Billah Comments (0)

Motasim Billah is a Monitoring and Evaluation Specialist at USAID/Bangladesh.

Since the revision of ADS 201 that mandated us to include Collaborating, Learning and Adapting (CLA) in our work, the M&E team in Bangladesh has received growing demands from A/CORs and Implementing Partners for a practical guide to integrating CLA approaches into the performance management, or monitoring and evaluation, of projects/activities. As the operationalization of CLA has been evolving for the Agency itself, there have been more opportunities for M&E practitioners like us to share reflections from our field experiences, which could ultimately contribute towards developing a comprehensive guide in this area. My desire to engage more deeply in this conversation became a reality when I secured the Program Cycle Fellowship with the Bureau for Policy, Planning and Learning! During my Fellowship, I was based in the Office of Learning, Evaluation and Research, where I focused intensively on CLA.

The Fellowship provided me opportunities to gain cutting-edge knowledge on CLA through my involvement in different CLA-related processes such as the Program Cycle Learning Agenda, the CLA Community of Practice, the adaptive management workstream, and CLA Toolkit development. It also provided me access to a wide range of resources, including different Missions' experiences with CLA and experts' opinions on integrating CLA into monitoring and evaluation. My time with PPL helped me organize my thoughts on CLA and write these reflections. The write-up will be divided into three sections that shed light on ways to integrate CLA into performance management.

The first section will spell out how to apply CLA to logic model development and indicator framing. The second section will show how to integrate CLA into the MEL plan, data quality assessments, and evaluations. The final section will demonstrate using CLA to track critical assumptions/game changers, establish feedback loops, and institute Mission-wide practices for CLA in the performance management of projects/activities.

Integrating CLA When Developing Logic Models and Indicators

Robust Logic Model

The development of a robust logic model is critical to enable the performance management system of an intervention (that is, a project or an activity) to function and to capture performance in a complex environment. Constructing a robust logic model requires analyzing a development problem from different perspectives, identifying the root causes of the problem and its linkage with other contingent problems, and tailoring solutions suitable for a particular context.  Building a rigorous logic model requires designers to invest a significant amount of time and ensure active participation of stakeholders in the construction phase. In this respect, adopting a CLA approach is useful throughout the process of developing a logic model. 

At the outset, a project or activity should identify stakeholders who can provide substantial insights in crafting the logic model. Once stakeholders are identified, experts in CLA or design could facilitate a logic model workshop to surface the best knowledge, expertise, and experience of stakeholders. A robust logic model may involve multidirectional causal pathways of solutions to a particular problem, developed through a two-stage process:

  • First, it identifies the core results that the intervention will directly address, based on its resources and manageable interest.
  • Second, it uncovers other potential results that are also critical for the achievement and sustainability of those core results, which the project or activity will pursue by leveraging the interventions of other development agencies, NGOs, the private sector, and governments.

USAID/FFP has been using robust theories of change and logic models in its programs in different countries, which can be a useful guide for other USAID programs.

Designers can also use the collaboration mapping tool (learn more here), developed by USAID/Rwanda and refined by PPL/LER, to unearth the additional actors operating in the targeted geographic areas. They can then rank these agencies and their respective interventions in terms of the benefits the intervention can draw from them and their effectiveness in achieving and sustaining our results. For example, in Bangladesh a USAID environment activity partnered with the Government, which allowed the activity to set up its district/sub-district level offices within the premises of the Government Fisheries Agency. This substantially helped the activity reduce logistical costs and strengthened the partnership with the Government. Other examples include joint project development, such as when USAID and DFID collaborated on a major NGO health service delivery project in Bangladesh. A designer can also do a beneficiary mapping exercise to reduce overlaps with other interventions in the same geographic region and thus maximize developmental gains for the target population. To document plans and efforts to develop partnerships, designers could include any collaboration map and stakeholder engagement strategy as an annex to a Project or Activity MEL plan.

Collaboration on Indicators 

The logic model workshop can also be used to extract a set of illustrative indicators to measure the result statements in the logic model. The illustrative indicators will subsequently guide the development of intervention-specific indicators that would be documented in the Project/Activity MEL plans. Once an activity starts rolling out, the Agreement/Contracting Officer's Representative (AOR/COR) could periodically (e.g., quarterly) hold indicator review meetings with Implementing Partners and other relevant stakeholders to assess the effectiveness of indicators in capturing performance and other factors influencing the activity. In this regard, data quality assessments conducted by both USAID and Implementing Partners can be good occasions to review indicators. At the Project level, Project Managers could organize similar indicator review meetings with AOR/CORs to learn about the status of indicators and their effectiveness. The participation of the Program Office in project-level indicator review meetings is critical, as it would later help align strategy-level indicators with projects as needed. If a project or activity needs to revise its indicators, the change should be adequately reflected in the MEL plan. 

A CLA Approach in MEL Plans, DQAs, and Evaluations

Including a learning agenda in MEL plans

A project/activity MEL plan should devote a section to learning that would essentially include a learning agenda at the project/activity level. A learning agenda generally entails a set of prioritized questions addressing critical knowledge gaps. In terms of scope, the questions can ask about short-, medium- and long-term issues that are critical for the achievement of an intervention's results. In this respect, a project-level learning agenda can guide activity-level learning questions, and a Mission-wide learning agenda can guide project-level learning questions. For example, the Senegal Mission recently developed a learning plan as part of its Performance Management Plan (PMP) that can help projects and activities articulate learning questions in their respective contexts. The learning section should include the learning activities that would be employed to answer each learning question. It should also include target audiences; the learning products (dissemination tools) that will be used to share learning outcomes; the roles and responsibilities of different actors; timelines; resources; and next steps.

Data Quality Assessment

The periodic data quality assessment is an important reflection tool for USAID and implementing partners to learn about data quality, gaps in existing data collection processes, data storage, and overall data management. A CLA approach can be very effective in conducting DQAs involving USAID, Implementing Partners (IPs), and the local NGOs who are often partners of IPs. Based on DQA findings, periodic (quarterly/bi-annual) reflection sessions could be organized at the activity level involving all sub-partners of IPs; these would provide opportunities to take course-correction measures while identifying data strengths and areas for improvement. At the project level, a pause-and-reflect session on 'learning from DQAs' could be organized at the regular Implementing Partners' meetings. The session would help both USAID and IPs learn from each other's experiences in managing data in order to strengthen the Mission-level performance information management system. In this regard, it would often be useful for the DQA section in the MEL plan to clearly describe the collaborative practices and activities that would be undertaken to conduct DQAs and to share those practices.

Evaluation

Evaluation is an effective tool for capturing systemic learning from the grassroots level. A collaborative approach involving the Program Office, Technical Offices, and relevant stakeholders in developing evaluation scopes of work can be instrumental in uncovering the most pressing issues in connection with implementation and management. In this regard, Project Managers and AOR/CORs should take the lead in consulting with beneficiaries, implementing partners, and relevant stakeholders in order to frame 'good evaluation questions.' While framing evaluation questions, it is helpful to explain how they relate to, or contribute to answering, at least one learning question on broader issues, for example, questions that test the development hypothesis or critical assumptions, or inquire about external factors such as local contexts or national/local-level policies which might influence interventions. The Bangladesh Mission has recently started the practice of including learning questions in its evaluation scopes of work. The evaluation section in MEL plans could explicitly describe how evaluations, to be conducted in the life of a project or activity, contribute to answering learning questions.

The dissemination of evaluation findings should extend beyond out-briefs of the evaluation team and uploading the document to the Development Experience Clearinghouse (DEC). In this regard, innovative approaches can be followed to share the learning with pertinent stakeholders. At the Mission level, project-wide evaluation dissemination sessions can be organized to share learning with senior management and technical teams. The Program Office can facilitate this session in consultation with Project Managers or AOR/CORs and Technical Offices. This type of session would be another platform for project/activity level decision making, as important insights might come out of discussions which could be useful for existing and new projects/activities. 

Evaluation recommendation tracker: A collaborative approach should be in place between the Program Office and Technical Offices to ensure that the recommendation tracker functions in an effective and timely manner. The Program Office can nominate a staff member as a Point of Contact (POC) for a particular evaluation recommendation tracker to work closely with the AOR/CORs or Technical Offices to follow up on the actions suggested in the tracker and agreed to by Technical Offices.

CLA Approach in Critical Assumptions, Feedback Loops, and Institutional Processes

Tracking Critical Assumptions/Risks/Game Changers

Many Project/Activity MEL plans could benefit from including a set of context indicators or complexity-aware monitoring tools in order to ensure that the overall contextual landscape of the Project/Activity is monitored. This would help us track our critical assumptions and risks periodically, as well as capture any game changers that can have unintended consequences on outcomes. In this respect, Project Managers, AOR/CORs, and Implementing Partners can employ different tools, such as regular field visits, focus group discussions, before- or after-action reviews, and other pause-and-reflect methodologies to collect qualitative stories. Project Managers, AOR/CORs, and Implementing Partners could organize grassroots-level stakeholder meetings with beneficiaries, teachers, local leaders, journalists, etc. (as relevant to the sector) at least quarterly to understand changes in context. In 2016, CARE presented a participatory performance tracker at a conference organized by USAID that can guide the development of context-specific community tools to gather contextual knowledge. The outcomes of these meetings and context-monitoring qualitative stories can be reflected in quarterly and annual reports. Moreover, at the activity level, AOR/CORs can also hold regular learning and sharing meetings with other donors with which the project or activity is collaborating. These learning meetings can shed light on the status of ongoing collaboration, including the challenges faced as well as opportunities to expand the existing collaboration. At the project level, the Mission can hold quarterly project learning meetings where Project Managers and AOR/CORs discuss issues related to performance, including theories of change, critical assumptions, and overall implementation and management. 

Establishing Feedback Loops: A Tool for Learning and Adaptive Management

Establishing strong feedback loops is important to capture systemic learning. It is helpful for Project and Activity MEL plans to explain how the feedback loops will be connected to overall performance management. In this regard, the feedback loops can be highlighted in any diagram of MEL activities and data collection flow charts to demonstrate how they would continuously provide information that contributes to performance and data management. MEL experts can set up digital feedback loops, such as online platforms, or manual feedback loops, such as feedback after a training or intervention. Feedback can also be anonymous, for example by setting up 'feedback boxes' in different hotspots in the field so that stakeholders can freely provide feedback. It is important to put mechanisms in place to ensure that relevant feedback flows to the decision makers at the Implementing Partners, AOR/CORs, and Project Managers. In this connection, USAID Missions can learn from USAID/Uganda's feedback loop for real-time adaptation.

CLA Practices in Monitoring, Evaluation and Learning Mission Orders and the MEL Working Group

Institutionalizing CLA practices in performance management requires reflecting them adequately in any Monitoring, Evaluation and Learning Mission Orders along with Project- and Activity-level MEL plans. The Mission Order would include overarching common principles on CLA practices that would in turn guide Project Managers and AOR/CORs to integrate CLA approaches into their Projects and Activities and respective MEL plans. In this respect, a Mission-wide working group on MEL can be formed, which can help implement the Mission Order and sustain good CLA practices in monitoring and evaluation. Currently, the Bangladesh Mission has a functional MEL working group composed of M&E professionals from Technical Offices and the Program Office. The working group provides a platform for discussing M&E-related issues. The working group plans to incorporate a strong 'L' component in its work by revising the existing M&E Mission Order and finding CLA Champions in the Mission.

Conclusions

Incorporating CLA practices into performance management is an evolving process. It is true that many of the recommendations provided in my three blog pieces might not work in all contexts, each of which might have realities requiring a different set of practices. I hope these blog posts will stimulate further discussion in the area of CLA in performance management and enable us to learn from each other's experiences and apply them in our respective contexts. 

There is no such thing as a dumb question!

May 3, 2018 by Guy Sharrock, Jenny Haddle, Dane Fredenburg Comments (0)
COMMUNITY CONTRIBUTION
In his 1997 book, The Demon-Haunted World: Science as a Candle in the Dark, Carl Sagan wrote: “There are naive questions, tedious questions, ill-phrased questions, questions put after inadequate self-criticism. But every question is a cry to understand the world. There is no such thing as a dumb question.”

In our earlier blog (see Adapting: Why Not Now, Just Do It!) we described how one multi-year Development Food Assistance Project entitled United in Building and Advancing Life Expectations (UBALE) was finding ways, with support from USAID/Food for Peace (USAID/FFP), to implement the notion of ‘adapting’. In conjunction with implementing partners, Save the Children, CARE and CADECOM, Catholic Relief Services (CRS) is aiming to deliver support to 250,000 households struggling to sustain their livelihoods in the most food-insecure region of Malawi.

Asking questions: a fundamental skill

Asking questions and seeking answers is vital for learning, accountability and high performance. It seems to us – through our work with UBALE on evaluative thinking – that asking thoughtful questions is a fundamental skill that is required by everyone engaged in CLA.

There are three elements that seem worthy of note (probably many more, but three will do for now!):

  1. Feeling safe enough to speak up and ask questions
  2. Developing and sustaining the habit of respectfully asking questions
  3. Ensuring there are processes to address questions

In this blog, we will address the first two elements; in a related blog, Adam Yahyaoui and Mona Lisa Bandawe will describe a process that UBALE has recently undertaken to refine and package some critical learning questions that will be advanced over the course of this year.

It’s okay to ask questions

Evaluative Thinking: Critical thinking applied in the context of monitoring, evaluation, accountability and learning

As individuals, we sometimes feel that if we ask questions, our supervisor, colleagues, and peers may consider us negative or intrusive or, worse still, ignorant or incompetent. This stops us from flagging concerns about our program performance, or from allowing ourselves to hold a different opinion from the majority view. Let’s be frank: it’s just easier not to ‘rock the boat’.

This challenge is not confined to a specific project or program, country, region or culture, nor to any work setting, whether it be government, non-profit or private sector. It is not even a novel concern: according to Kofi Kisse Dompere, there is a traditional African thought suggesting that, “No one is without knowledge except he who asks no questions.”

So, too often the so-called enabling environment for those who wish to ask questions can feel disabling or, at the very least, not hugely supportive. In her excellent TEDx video, Professor Amy Edmondson opens with three vignettes illustrating different scenarios in which an individual’s desire not to look dumb overcame the need to ask a question. She suggests that this matters because, “it robs us, and our colleagues, of small moments of learning.” She proposes three things that can help to build a ‘psychologically safe’ office climate:

  1. Frame the work as a learning opportunity, not merely an activity to be completed. In a complex setting, such as the one in which UBALE is intervening, there are many interventions for which it is not possible to know in advance what the outcome will be, or what the unintended consequences, good or bad, might be, at least not with absolute certainty. It is this uncertainty, and the systemic nature of the setting, that justifies treating each activity as a learning event. In Edmondson’s words, this “creates the rationale for speaking up.”
  2. Admit to your own shortcomings, as you surely can’t have a monopoly on wisdom! You cannot know everything in advance, and you will miss things, particularly when operating in a complex setting where there are so many moving parts. So, for the task or activity to be performed to a high standard, you need the help of your colleagues and partners. This “creates more safety for speaking up,” according to Edmondson.
  3. Encourage lots of questions by modeling this yourself and encouraging others to do the same. This makes it essential for staff to speak up.

Developing the ‘questions’ habit

While it is a critical element, ensuring that the working environment is ‘psychologically safe’ is, on its own, insufficient to achieve high-quality CLA. It is equally important that staff know how and when to ask questions in a respectful manner.

Let’s assume senior managers have bought in to the importance of psychological safety and start asking lots of questions; their aim is to encourage their colleagues and subordinates to follow suit. But this may not come naturally or easily to those whose behavior they are seeking to change. Among our evaluative thinking resources, we suggest types of questions that help you know when evaluative thinking and learning are happening:

  • Why are we assuming X? What evidence do we have?
  • What is the thinking behind the way we do Y? Why are we not achieving Y as expected?
  • Which stakeholders should we consult to get different perspectives on X? and so on.

In the early part of our capacity-strengthening work with UBALE, a good amount of time was spent on this topic, both on generating questions and on practicing asking them. It was apparent that some colleagues found it easier than others to acquire and apply the skill; however, with time and practice, UBALE staff demonstrated that everyone has the capacity to ask questions that contribute to improved project learning. Our implementation intention should be to make it a habit!

We are planning to trial a couple of ideas arising from our recent work with UBALE to instill a habit of question asking, especially in field staff:

  • Working with staff to develop portable ‘flash cards’, each containing a question that can unlock a new line of inquiry, and
  • Bringing greater intentionality and being more systematic through developing checklists or question prompt lists that will help staff avoid any unwitting blind spots as they develop the ‘asking questions’ habit

Three key CLA lessons

  • Asking questions implies organizational change. Things are different with CLA, or at least they should be. Adopting a CLA approach implies that an organization is committed to becoming a true learning organization in which processes for asking and discussing questions are embedded in all operations. This requires the right kind of enabling conditions.
  • Asking questions is critical to CLA. The Nobel laureate physicist, Richard Feynman, wrote, “I would rather have questions that can't be answered than answers that can't be questioned.” If monitoring data appear to suggest some variance between expected and actual achievements, it is important to ask why, and what the implications are for project activity. This necessitates asking questions to deepen understanding of what is happening, and an openness to adapting earlier thinking. This requires appropriate processes and tools.
  • Asking questions requires a certain kind of staff. CLA necessitates staff who are, in the words of David Garvin and Amy Edmondson, “tough-minded enough to brutally confront the facts; to talk directly about what works, and what doesn’t work. It’s about being straightforward.” This must be conducted in a way that respects other people and their perspectives. This requires new staff skills.


Learning & Adapting to Combat HIV/AIDS in Uganda

Apr 30, 2018 by Maribel Diaz Comments (0)
COMMUNITY CONTRIBUTION

This blog post has been cross-posted from Social Impact's blog. Maribel Diaz is a Technical Director with Social Impact.

The science behind HIV/AIDS treatment guidelines and protocols is well established. Governments and donors are active in addressing the epidemic by allocating resources and setting targets. Yet there is still much to gain from pausing to learn from the data and from service providers, and adapting to realities on the ground. There are qualitative ways to address real-time findings in data and adjust implementation.

USAID calls this approach Collaborating, Learning and Adapting (CLA). CLA facilitates cooperation among key players. It generates learning based on real-time data and collective solutions. With CLA, new ways to implement HIV/AIDS programming can be adapted, tested, and analyzed and potentially be scaled-up rapidly.

In Uganda, the Strategic Information and Technical Support (SITES) activity is supporting PEPFAR donors to examine and analyze data and make decisions based on the changing face of the HIV/AIDS epidemic. Social Impact leads the Collaborating, Learning and Adapting (CLA) team, which facilitates learning events with PEPFAR implementing partners.

HIV/AIDS in Uganda

PEPFAR has an ambitious goal for addressing the HIV/AIDS pandemic in priority countries. Referred to as the 90-90-90 objectives, the aim is to have 90 percent of people living with HIV know their status, 90 percent of people who know their status access treatment, and 90 percent of people on treatment achieve suppressed viral loads.
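A quick back-of-the-envelope reading shows how these targets compound: if all three are met, the share of all people living with HIV who are virally suppressed works out to roughly

\[
0.90 \times 0.90 \times 0.90 = 0.729 \approx 73\%.
\]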

Recent data from Uganda showed a lag in moving towards these targets. There was a large disparity between patients who newly tested positive and the number of those new patients starting antiretroviral therapy (ART).

Learning from the data

To address this lag, we designed a problem-solving learning event in response to findings on the PEPFAR TX_NEW indicator. TX_NEW tracks the number of adults and children newly enrolled on ART. It is expected that the characteristics of new clients are recorded at the time they newly initiate life-long ART. But the data showed this wasn’t happening as expected.
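As a rough illustration of the kind of check that surfaces such a gap, the sketch below compares newly identified positives with new ART enrollments by district and flags where linkage looks low. It is a minimal sketch, not the SITES team's actual tooling: the CSV file and column names are hypothetical, and it assumes district-level counts of newly identified positives and new ART enrollments (TX_NEW) are available for the same period.

```python
# Minimal sketch (hypothetical data layout): flag districts where new ART
# enrollments (tx_new) lag far behind newly identified positives (hts_pos).
import csv

LINKAGE_TARGET = 0.90  # echoing the second "90" in the 90-90-90 objectives

with open("district_quarterly_results.csv", newline="") as f:  # hypothetical file
    rows = list(csv.DictReader(f))  # expected columns: district, hts_pos, tx_new

for row in rows:
    positives = int(row["hts_pos"])
    new_on_art = int(row["tx_new"])
    proxy_linkage = new_on_art / positives if positives else 0.0
    if proxy_linkage < LINKAGE_TARGET:
        print(f"{row['district']}: {new_on_art} of {positives} newly identified "
              f"positives started ART ({proxy_linkage:.0%}) -- worth a closer look")
```

A simple district-level listing like this is the sort of real-time evidence a learning event can put in front of stakeholders for root cause analysis.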

SI structured this event using tools to initiate a discussion with stakeholders, particularly RHITES EC, a large multi-service implementing partner working within 11 districts in East Central (EC) Uganda.

We engaged stakeholders in an organizational development/adaptive management exercise to undertake a process of “Head, Heart, Hands.” The group used their heads to examine real-time data. They took the information to heart to create meaning and propose potential solutions. They created action plans to have in their hands to apply solutions and adapt processes.

Understanding root causes

We presented data showing the current challenges in the continuum of response to the HIV/AIDS epidemic. We led a root cause analysis exercise to help participants examine what was keeping new patients from starting on ART. They identified: poor counseling skills and inadequate psychosocial support for newly tested patients; poor incentives for health workers; outreach activities not targeting the right at-risk populations; inadequate counseling pre- and post-testing; and health workers not taking the lead in testing.

Identifying solutions

Discussion groups prioritized five leading drivers and proposed potential solutions. Participants identified short-term processes that can easily be adapted in addition to longer-term motivational benefits for staff. For example, including staff in health facility planning will result in an improved vision for service delivery and a collective sense of urgency to address problems. Additional training for village health teams and expert patients in counseling can help bridge the gap between testing and enrollment in ART. Other simple but important improvements led participants to realize that there are solutions within their locus of control.

Planning for the future

Finally, we led an action-planning exercise. The participants recorded proposed solutions to the five leading drivers and assigned responsibility for addressing them among specific Implementing Partners, District Health Staff, and decision-makers.

In addition to action-planning, participation in the learning event provided tools for IPs and District Health staff to take back to their respective organizations and service-delivery sites for future problem-solving based on data.

Using CLA is helping make a difference in the fight against HIV/AIDS in Uganda.

Can a Competition Create Space for Learning? Three Design Factors to Consider

Apr 20, 2018 by Frenki Kozeli Comments (1)
COMMUNITY CONTRIBUTION


Development practitioners are often innovating, piloting, and problem-solving — but sometimes these initiatives have a hard time getting disseminated past the project annual report.  At Chemonics, the Economic Growth and Trade Practice and the Education and Youth Practice joined forces to kick off 2018 with the launch of our Market Systems and Youth Enterprise Development Innovation Contest, an endeavor designed to spark knowledge-sharing between our projects and give our staff an easy opportunity to learn from one another.

We asked our global workforce to share the models and methodologies they use for market systems development and youth enterprise development. The incentive? The opportunity to share their work with their peers and industry leaders through remote learning events, publications, and in-person attendance at leading development conferences. We used an off-the-shelf online contest platform that could be easily accessed in real time by our projects around the world and opened the submissions to evaluation by expert panels and peer voting. Over the six-week contest period we received entries from Europe and Eurasia, Latin America, the Middle East, Asia, Eastern Africa, and Southern Africa. The reach of the contest and the enthusiasm of our staff were invigorating to experience.

With the contest itself behind us, we are taking a moment to reflect on what made this initiative an effective learning event and what we, as designers, could share with colleagues looking to launch their own. Here are three factors to consider:

1. Leave Your Preconceptions at the Door

When harvesting knowledge from the field, try to leave your preconceptions behind and open the initiative up to as many participants as possible. We assumed our competitiveness projects would account for most of the submissions — an assumption that turned out to be incorrect. Opening the contest to our entire global workforce broadened the diversity of the projects represented. We heard from energy projects working in youth workforce development, peace and stability projects convening young entrepreneurs, and enabling environment projects taking a market systems approach to women’s economic empowerment.

2. Build In a Moment for Reflection

We wanted this contest to be an opportunity for our staff to take a “pause and reflect” moment, so we integrated learning into the design of the contest. We asked our projects to tell us about what was unexpected, what went wrong, how they adapted, and what their path to scale and sustainability would be moving forward. The result was that we weren’t hearing success stories so much as accounts of process, methodology, and adaptation, a true reflection of project implementation in the dynamic environments where we work.

3. Don’t Stop the Momentum of Sharing

Our global workforce responded strongly to the opportunity to share their experiences with their colleagues around the world. To build on this momentum, we’ve sponsored learning events for our winning teams in the field to discuss the models featured in the contest and their adaptability to different contexts. We’re organizing webinars throughout the year so our winning teams can share and discuss their models with our global workforce, and, more importantly, so we can promote project-to-project learning and collaboration. And finally, we’ll be bringing representatives from our winning projects to Washington, D.C. to attend the SEEP Network Annual Conference and the Global Youth Economic Opportunities Summit, leading industry events for market systems and youth enterprise development, to enhance our learning and collaboration with the development industry at large.

By now, you might be asking yourself who these mysterious winners are. Stay tuned in the next few weeks as we share the winning market systems and youth enterprise development models from Uganda, Moldova, Pakistan, and Ukraine, featuring a range of creative solutions — from motorbikes to river barges and robotics to school buses.

This blog has been cross-posted from MarketLinks. Frenki Kozeli is a manager on Chemonics’ Economic Growth and Trade Practice.

 

Qualitative Visualization: Chart choosing and the design process

Mar 29, 2018 by Jennifer Lyons Comments (0)
COMMUNITY CONTRIBUTION

In order for data to be used for learning and adapting, the data itself needs to be easily accessible. Evaluators and researchers have been hungry for resources on how to effectively present qualitative data, so last year Evergreen Data launched a qualitative reporting series, and we recently released an updated qualitative chart chooser. In this post, I’ll explain how to use this tool and share examples of it in action.

Chart Chooser 3.0

We built this tool to be relevant for all levels of qualitative data use. Whether you only collect qualitative data as an open-ended question attached to your quantitative survey, or you are doing full-blown qualitative research, this handout will hopefully provide you with some new visualization ideas. Along the top of the table, you have the option to quantify your qualitative data in the visual. In some cases, quantification can break down a bunch of qualitative findings into a simple yet effective visual like a heat map. On the other hand, when you quantify the data, you risk losing context and the personal nature of qualitative data.
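To make the quantified option concrete, here is a small sketch of what a heat map of coded qualitative data might look like in code; the themes, respondent groups, and counts are made up for the example, and matplotlib is just one of many ways to draw it.

```python
# Minimal sketch: a heat map of (made-up) coded theme counts by respondent group.
import matplotlib.pyplot as plt
import numpy as np

themes = ["Access to services", "Staff turnover", "Funding gaps", "Community trust"]
groups = ["Clients", "Frontline staff", "Managers"]
counts = np.array([[12, 7, 3],    # how often each theme was coded, per group
                   [4, 15, 9],
                   [6, 10, 14],
                   [11, 5, 2]])

fig, ax = plt.subplots()
ax.imshow(counts, cmap="Blues")
ax.set_xticks(range(len(groups)))
ax.set_xticklabels(groups)
ax.set_yticks(range(len(themes)))
ax.set_yticklabels(themes)
for i in range(len(themes)):      # annotate each cell with its count
    for j in range(len(groups)):
        ax.text(j, i, counts[i, j], ha="center", va="center")
ax.set_title("How often each theme was coded, by respondent group")
fig.tight_layout()
plt.show()
```

The trade-off the chart chooser describes applies here too: the heat map condenses many coded responses into one view, but the individual voices behind the counts disappear.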

Next, the chart chooser is broken down by what you want to include in your visual: just a highlighted word or phrase, or a higher level of analysis. Along the left-hand side of the chooser, you can see another breakdown depending on the nature of your data, such as whether it represents flow, comparison, hierarchy, etc. Last, charts and visuals (along with cute little illustrations) are suggested as options. You know your audience, your data, and your story, so use this chart chooser to pick the best visual to fit your context.

The design process:

Let’s put this chart chooser to use! Let’s say that you are working with the homelessness service community in your area. Using a mixed methods approach, you have collected data on the homelessness system including the causes and the continuum of services available in your community. You are writing a 10-page report on the findings, but you want to summarize the causes and continuum of services available in your area. You are not looking to quantify the data because you want to give specific program examples. After taking time to look at the chart chooser, you decide that there is a flow to the nature of your data, so using a flow diagram will be the best fit.


Producing quality data visuals is not just about choosing the right chart; you need to layer the right chart with quality design technique. Let’s look at how a flow diagram would look without putting much effort into crafting a design that tells a story. This (right) took me about 5 minutes using PowerPoint smart art.

The problem is this visual doesn’t tell a compelling story about the journey of homelessness and the services offered at each point in the continuum. Let’s reframe this visual to better showcase the journey.

This (right) is getting better! I can start to see the story develop. This still needs some love, like the intentional use of color, an effective title, and a more personalized touch.

Images of people and quotes are taken from a CBS News report (https://www.cbsnews.com/pictures/before-and-after-from-homeless-to-hopeful/) on the 100,000 Homes Campaign.

This (above) is starting to look like something to be proud of! It is a piece that can be shared separately from the 10-page report, and it summarizes your community’s journey of homelessness. This was all made in PowerPoint using textboxes, lines, photos, and square shapes.

I can imagine that the references to the different programs could have embedded bookmark links to the sections in the report that describe each program in more detail. If this were posted online, the programs could link to websites providing more information and resources on each of the programs. Pushing the idea of using color intentionally even further, the colors used in this diagram should be threaded throughout the entire report. This is one of my favorite techniques! It helps chunk up a long, mixed methods report into bite-sized pieces that the brain can better interpret.

To keep up-to-date on more qualitative reporting ideas, follow our qualitative blog thread.  For any questions or comments, you can reach me at jennifer@stephanieevergreen.com.

Theory of Change: It’s Easier Than You Think

Mar 13, 2018 by Kasia Kedzia Comments (0)
COMMUNITY CONTRIBUTION

This blog is cross-posted from Chemonics.

Kasia Kedzia is the director of Chemonics’ Monitoring, Evaluation, and Learning Department.

Imagine you are chronically late to work. If your goal is to get to work on time, you may have identified multiple reasons for being late. If you only consider one of these reasons and don’t identify root causes, which are simply other reasons for your lateness, you will continue to be late. One reason may be traffic, but another may be leaving your house late, or not getting to bed on time.

Similarly, in the development work we do, if our solutions don’t match the root causes of the problems, our work will fail to make a difference and the consequences can be staggering. Creating and applying theories of change (ToCs) can help us be more thoughtful, deliberate, and effective in our interventions. However, there are two major misconceptions that keep us from systematically and continually applying theory of change to our work.

1. Theories of Change Are Too Complex

Theories of change are actually simpler than you think.

The ToC is just a road map. It’s the articulation of how and why a given set of interventions will lead to specific change. It follows a generally straightforward “if/then” logic — if the intervention occurs successfully then it will lead to the desired result. Of course, behind that logic is a set of beliefs and assumptions that support our expectations about how change will occur.

Here is a simple activity to create a theory of change. Take out a few sticky notes and have your team jot down the major problem and the contributing problems as separate statements, each on their own sticky note. Order them following the “if/then” logic. Write down corresponding solutions on separate sticky notes and order them in the same fashion. If each team member does the exercise separately, you can come together and check your logic. Does each member’s order correspond with yours? Do your solution statements line up with your problem statements? If not, it can be a red flag to reexamine your logic.

The reason why ordering the sticky notes can be so important is that it can expose your assumptions. We assume that each of us would follow the same causal logic to get to a given solution, but that’s not a given. Articulating our theory and stating our assumptions out loud can go a long way to helping flesh out our path.
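To make the ordering concrete, here is a small sketch of the chronic-lateness example as an if/then chain; the problem and solution statements are illustrative, and the point is simply that each solution should sit at the same step as the problem it addresses.

```python
# Minimal sketch: the chronic-lateness example as ordered problem/solution pairs.
# The statements are illustrative; the order runs from root cause to final result.
problems = [
    "Evening coffee keeps me awake",
    "I stay up too late",
    "I wake up late",
    "I leave the house late",
    "I arrive at work late",
]
solutions = [
    "Cut off coffee by mid-afternoon",
    "Get to bed on time",
    "Wake up on time",
    "Leave the house on time",
    "Arrive at work on time",
]

# Read the causal logic aloud: each solution, if it holds, should enable the next.
for earlier, later in zip(solutions, solutions[1:]):
    print(f"IF {earlier.lower()}, THEN {later.lower()}.")
```

Reading the chain out loud is the same check as comparing sticky-note orders: if a link sounds implausible, an assumption is hiding there.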

The development problems we grapple with are complex and can lead to complex theories of change that exacerbate the myth that theories of change are difficult to understand. Yet the purpose of a ToC is to get at and articulate the most important steps for the greatest level of impact, so ideally it should be simple. Historically, USAID began the practice of drafting and passing on a theory of change in response to turnover at USAID missions around the world. It was an attempt to ensure that the approach didn’t change with every changing of the guard. The simpler and easier it is to follow a ToC, the easier it is for others to carry on the work, monitor progress against the ToC, and adjust accordingly.

2. Theories of Change Are a “Nice to Have,” Not a “Must Have”

In addition to being simpler than you think, ToCs are also more important than you might realize. When you run through your causal logic, you can get to the root of the problem and identify corresponding solutions for each problem. When you order your problem statements, you form your theory. You can then follow the pattern by also ordering the corresponding solutions.

In the causal path above, if some of the root problems were not identified, such as staying up late watching Netflix or not being able to fall asleep due to a coffee habit, it would be easy to jump to the wrong solution. If we stop at traffic or waking up late and assume that is our cause, our potential solution won’t result in getting to work any faster. The path above is also very different from this second causal path, demonstrating that two individuals often approach the same problem differently but may not realize it until mapping out their assumptions through a ToC.

Making similar mistakes on a development program can lead to much worse consequences. Even taking a few moments to stop and articulate our assumptions and some very straightforward causal logic can help shift our approaches. When we have a good ToC in place that is plausible, feasible, and testable, we are better able to not only articulate what we think success will look like but how we will know if we are on track over time. We will have a theory around what the project is “supposed to look like,” a strategy to accomplish it, and a plan to know if our intervention is working as the project goes on.

A Simple Yet Effective Tool

When we map out a theory of change, we can walk someone through what project success looks like, how it’s supposed to work, the strategy we are taking to accomplish our goal, what we want to learn, and how we will know if it’s working. Contrary to popular belief, the point is to keep the ToC simple. Above all, when we apply it we may be preventing some major potential pitfalls from the start. At best, we will have greater impact through our work. At the least, we will get to work on time.

Adaptive Learning? Liberia Can Teach the World a Thing or Two!

Feb 20, 2018 by  Comments (0)
COMMUNITY CONTRIBUTION

This blog was cross-posted from iCampus.

In the summer of 2017, the Accountability Lab Liberia and iLab Liberia teams conducted a learning mapping to understand why and how Liberian organizations learn. We think adaptive learning is important in Liberia because development is a complex and dynamic process where interests, relationships, and incentives can change rapidly. Adaptive learning techniques help us reflect on why we do what we do; they help us improve what we do; and, hopefully, they help us find politically savvy ways to sustain positive change across the country.

Our learning mapping revealed that the frameworks and concepts of organizational learning are not yet well understood or used to guide strategic or programmatic practices in Liberia, by either Liberian or international organizations. Many local organizations have poor goal-setting procedures, weak data collection systems, and low monitoring and evaluation capacity. Many international organizations are much better at collecting information but are often unable to integrate learning activities into grant-making and program implementation; they largely share learning internally with colleagues rather than externally with other organizations, and can have limited ability to adapt activities rapidly based on the data collected. At the same time, we also found that there was not a “learning community” in which staff responsible for these activities could themselves learn and share ideas. We therefore recently organized Liberia’s first Learning Conference with our partners at USAID LAVI to do three things. First, to share knowledge around the idea of organizational learning among those responsible for monitoring and evaluation, data collection, and programmatic decision-making in Liberian organizations. Second, to share ideas around learning and learn from each other about how we can better collect data and use information to allow for adaptation and improvement in our work. Third, to build on the nascent learning community within Liberia and develop a network of engaged, connected learning advocates from across government and civil society. So what did we learn from the learning conference?! A few key takeaways. In Liberia, there are:

i) The beginnings of a shared definition of learning… Without a collective understanding of what learning means, it is hard to think through how best to make it happen, both within and across organizations. At the conference we were encouraged by the areas of overlap we saw across organizations in terms of their understanding of learning. Almost all groups indicated that they understand learning not just as a process of collecting information but as actually using that information to change. Collectively we defined adaptive learning in the Liberian context as the extent to which organizations engage in learning activities to capture, save and share lessons, and use this knowledge to adapt their activities.

ii) But common challenges to learning. Almost all the participants at the conference indicated that there are some key shared barriers to learning. For example, very few organizations in Liberia actually collect the data they need to provide the basis for learning. The incentives within organizations are also not aligned in a way that supports learning. The emphasis for local organizations tends to be data collection against pre-defined frameworks set by donors rather than in a flexible way that would allow for improvement. Finally, M&E and learning tend to be compartmentalized and allocated to one specific staff member rather than understood as a mindset that should infuse organizations from top to bottom.

iii) And some fantastic learning innovations. The constraints to learning are real in Liberia, but it was fantastic to hear about some of the ways that organizations are learning despite the challenges. For example, one learning organization has developed its own internal portal to document all successes and failures and synthesize information about each. Another creates short monthly summaries of successes, surprises, and disappointments from every employee. Another makes sure to conduct surveys in communities about their work only on Saturdays, when people will be at home to respond, and in local languages that make sure feelings can be fully expressed. All of this takes more time and energy, but the learning that comes out of it is far more valuable as a result.

iv) With diverse forms of communication around learning. The group agreed that long, written reports are not the best way to build communities around learning. We need to move away from traditional approaches to dissemination and think through how to engage diverse audiences. Ideas that are actively being used in Liberia for this include: infographics, blogging, social media, learning calls, podcasts, “pause and reflect” moments, video interviews, WhatsApp groups, fail faires, radio shows and in-person learning exchanges, among others. There are plenty of ways we are already communicating learning; the key is to move beyond the tyranny of Microsoft Word!

v) And with the recognition that learning is political. It is clear that there is a subset of civil society in Liberia that is actively engaged in learning. Although progress is nascent, there is interest in further knowledge and understanding, and energy for further collaboration. The group reflected that this energy is not always matched within government, particularly given the recent political transition. This brought up larger issues about the political nature of learning: who gets to learn and why; the power dynamics behind it; and where political incentives lie to actually ensure that information is translated into learning. Too often we think about learning as a technical process, but in Liberia, as in all contexts, it is political, and we must remember that.

More information about the event, along with photos and videos, is available on the iCampus Facebook page and Twitter account, and you can listen to a podcast of the “fireside chat” session of the event with Dr. Tanya Garnett from Liberia Strategic Analysis here.

Video: RTI International's CLA Sprint

Feb 8, 2018 by Amy Leo, Molly Chen Comments (0)
COMMUNITY CONTRIBUTION

Collaborating, Learning and Adapting (CLA) Challenge Week prompted RTI International's International Development Group to convene their Monitoring, Evaluation, Research, Learning and Adapting (MERLA) Community of Practice to share how members are implementing CLA in their projects and divisions.

Molly Chen, MERLA Specialist with RTI International's Global Health Division, International Development Group says: "RTI's International Development Group formed a Monitoring, Evaluation, Research, Learning and Adapting (MERLA) Community of Practice this year and we've challenged ourselves to develop and nurture a culture of learning across our organization. During the CLA Challenge Week, the MERLA Community of Practice members met to share how we are implementing CLA in our projects and divisions. We heard from MERLA specialists helping to develop learning agendas for their projects, disseminate research findings with the goal of adapting these learnings back into the project, and organizing pause and reflect moments for our team of knowledge management experts to share their experiences."

Watch RTI International's CLA Challenge Week video to learn how MERLA Community of Practice members are integrating CLA in their work:

Molly reports that going forward, "the RTI MERLA Community of Practice will continue to take on the challenges we've set and will share these experiences through different mechanisms of knowledge sharing such as Yammer, SalesForce, and a potential blog post on our Medium blog!"

What is Adaptive Management?

Feb 8, 2018 by Learning Lab Comments (1)

Curious about how USAID defines and thinks about adaptive management? Well, USAID recently released a Discussion Note which complements ADS 201.3.1.2 Program Cycle Principles by elaborating on Principle 2: Manage Adaptively through Continuous Learning. The Discussion Note is intended for USAID staff interested in learning about recent and promising practices in adaptive management across the Program Cycle. 

USAID’s work takes place in environments that are often unstable and in transition. Even in more stable contexts, circumstances evolve and may affect programming in unpredictable ways. For its programs to be effective, USAID must be able to adapt in response to changes and new information. The ability to adapt requires an environment that promotes intentional learning and flexible project and activity design, minimizes the obstacles to modifying programming and creates incentives for managing adaptively. 

Adaptive management is defined in ADS 201.6 as “an intentional approach to making decisions and adjustments in response to new information and changes in context.” Adaptive management is not about changing goals during implementation; it is about changing the path being used to achieve the goals in response to changes. Like other donors and development organizations (see, for example, the following initiatives: Doing Development Differently, Problem-Driven Iterative Adaptation, Thinking and Working Politically, and The World Bank’s Global Delivery Initiative), USAID is increasingly recognizing the importance of adaptability for its work to be effective. ADS 201 now integrates adaptive management approaches throughout the Program Cycle. 

“Manage adaptively through continuous learning” is one of the four core principles that serve as the foundation for Program Cycle implementation. 

This Discussion Note is organized around the phases of the Program Cycle (strategy, project, and activity design and implementation; monitoring and evaluation; and learning and adapting). While the adaptive management approaches described here are examples of initial entry points associated with a specific phase of the Program Cycle, many of these approaches lead to adjustment in other areas. The note concludes with sections on enabling conditions and a description of the skills and attributes of adaptive managers.

Click here to read more!
