Summarizing Evaluation Evidence

Quickly summarize the main findings of all potentially relevant evaluations to help ensure that what is most important catches the attention of a CDCS development team.

Locating evaluations does not ensure that the evidence they contain will be used. CDCS development team members, busy with other tasks, may not all find time to read them. Having one person summarize the most important findings of relevant evaluations can benefit the whole team.

For this purpose, a simple, optional evidence template, rather than a long report, may be useful for responding to the CDCS requirement to “reference the evidence that supports the causal linkages” in the Development Hypothesis section of a CDCS. If evidence is provided in a table of this sort, include a URL link to each report cited.

Evaluation Evidence Summary

Evaluation: Evaluation of Trade Hubs Located in Accra, Ghana; Gaborone, Botswana; and Nairobi, Kenya. Global Business Solutions, 2006

Specific Evidence: The Trade Hubs throughout Africa met or surpassed their targets, despite having different performance-based contracts and terms of reference. The Trade Hubs, with varying degrees of success, have facilitated intra-regional trade through programs on transport corridors, customs modernization and harmonization, sanitary and phyto-sanitary (SPS) training, etc., and have integrated activities well with different stakeholders when appropriate. Their programs and activities advanced the policy objectives of the AGOA. However, communications strategies in the Hubs were not robust enough to facilitate implementation of multi-faceted, multi-agency programs such as the TRADE Initiative.

Evidence Strength: Document reviews; site visits; key informant interviews; online client and partner surveys (81 responses; 23% response rate); survey instruments and tabulations included.

What is Evidence Strength?

USAID's interest in bringing high-quality evaluation evidence to bear on decisions about future programs is evident throughout its CDCS Guidance, as well as in the distinction USAID's Evaluation Policy draws between the strong evidence that rigorous Impact Evaluations can produce and the relatively weak evidence sometimes found in older USAID evaluations. When evaluation evidence about a particular country or development hypothesis is summarized, it is useful to include a brief note indicating how strong each piece of evidence is, and thus how seriously it should be taken. A review of an evaluation's methodology will usually reveal the degree to which it used formal social science research methods, the number and types of data sources it drew on, the data series it used, and other elements that affect the quality of the evidence it provides.