Policy Brief | January 2011

Evaluating for Effective Development Assistance
Problem

Understanding what works and why is crucial to the success of U.S. development assistance and to achieving the most cost-effective use of limited aid resources. A number of challenges inhibit the U.S. Government’s ability to achieve and demonstrate results, including: (1) project timeframes not aligned with the time required to meet development outcomes; (2) reporting requirements that focus more on how money is spent than on lives changed; (3) placing donors’ demands over local priorities for long-term changes; (4) reluctance to acknowledge development’s complexity, take risks and learn from failures; and (5) insufficient funding for evaluation.

Recommendations & Actions

InterAction welcomes the prominence given to evaluation and evidence-based decision-making in the Presidential Policy Directive on Global Development, the Quadrennial Diplomacy and Development Review and the draft Global Partnerships Act of 2010. The U.S. Government should prioritize development assistance evaluation so it can strengthen our work with developing country governments and civil society to design and undertake cost-effective interventions that build their capacity to address their development needs and thus contribute to lasting development change. The U.S. Government must promote a range of evaluation approaches that suit different contexts, interventions and evaluation questions, as well as provide guidelines for evidentiary rigor that address a range of situational circumstances and anticipated changes.

• Support USAID’s efforts to reinvigorate its evaluation function and strengthen international development evaluation practices.
• Prioritize evaluation within development assistance by:
  » Setting aside a minimum portion of program budgets for evaluation purposes;
  » Requiring USAID missions to reserve a portion of their funding to conduct evaluations across portfolios of related projects to promote learning in areas of strategic interest;
  » Focusing on outcome-level changes; and
  » Changing incentives to reward those who learn from their interventions, rather than those who report positive results.
• Make evaluation responsive to the host governments, civil society organizations and people with whom we work by mandating their inclusion as partners in evaluation design, implementation, dissemination and use.
• Ensure evaluation methods and notions of rigorous evidence are well-suited to the contexts and questions raised by the evaluation.
• Where security concerns allow, present evaluation findings transparently and publicly, in-country and in the United States.

Results

Greater commitment to evaluating development programs, and to evaluating them in the appropriate ways, will reap positive results, including:

• The U.S. Government, through USAID, will emerge as a leader in development program evaluation and will be able to undertake a more effective, efficient and responsive U.S. foreign assistance program.
• Development assistance implementers will be able to reach a larger number of people and contribute to better outcomes with the limited resources available. In more concrete terms, this means having an aid program that is more responsive to the governments, communities and individuals with whom we partner, and more capable of producing lasting improvements in people’s lives.
Background

Within USAID, steps are already being taken to rebuild the agency’s evaluation capacity. In June 2010, for example, USAID established the Office of Learning, Evaluation and Research within the new Bureau of Policy, Planning and Learning. The new office promises to elevate the prominence of results measurement and the use of evidence in USAID. This same office is working on a new Evaluation Policy (due for release in January 2011) that will articulate expectations regarding evaluation practices, with an emphasis on higher methodological standards, transparency, and the utility of performance and impact evaluations. USAID is also developing new evaluation training programs for program managers and evaluation specialists. In addition, through an interagency process, it is actively engaged in developing the monitoring and evaluation frameworks for several presidential initiatives (Global Health, Feed the Future and Climate Change) in a way that is consistent with a renewed emphasis on impact measurement.

Evaluation at USAID has an uneven history, marked by a sharp decline in capacity over the past decade.¹ One particularly troubling trend has been the growing emphasis on performance monitoring at the expense of evaluation, resulting in a decline in the number of evaluations conducted and too much attention paid to collecting data on large numbers of low-level indicators. Much of the remaining evaluation work is done by implementing partners, preventing USAID from gaining the experience and advantages that stem from undertaking such work. In 2006, the Center for Development Information and Evaluation, which had been charged the previous year with leading an initiative to revitalize evaluation within USAID, was abolished.

These and other factors have resulted in several problems that must be addressed for development assistance to be successful. These include:

• Focus on outputs rather than outcomes: Evaluation at USAID has suffered from an overriding focus on tracking how money is spent rather than on what results aid has achieved. Too often, the focus has been on measures such as the number of schools built or teachers trained, rather than on the ultimate objective of such activities: improvements in children’s learning. Furthermore, these measures are often decided in Washington, with insufficient input from those actually overseeing, implementing or affected by programs in the field.

• Aversion to risk: Just as no one would expect every business start-up to become profitable, it is unreasonable to expect every development program to be successful. Fear that negative evaluation results will undercut support for aid (or funding for a specific organization) means evaluation findings are not widely shared, limiting opportunities to correct and learn from failures.

• Demand for quick results: Development takes time. Yet organizations are often asked to report on impact over time periods too short for that impact to be achieved, much less evaluated. While some meaningful impacts, such as increases in agricultural yields or patients treated, can be measured within a relatively short time period, others, such as increases in income or improvements in health status or learning, may take years to materialize.

• Inadequate funding for evaluation: Monitoring and evaluation-related budget items are among the first to be cut when budgets need to be trimmed, a sign that the agency does not prioritize evaluation.

There are encouraging signs that some of these problems will be addressed and that the decline of evaluation within USAID is starting to be reversed. The prominence given to evaluation and evidence-based decision-making by senior Administration officials is particularly welcome. Setting the tone for all agencies involved in development assistance, the Presidential Policy Directive on Global Development calls for the rigorous evaluation of the impact of policies and programs, and for the results of such evaluations to drive policy and budget processes. In addition, the directive calls for more “substantial investment of resources in monitoring and evaluation.”²

Previewing the contents of the Quadrennial Diplomacy and Development Review, Secretary of State Hillary Clinton set out a vision of USAID as an agency in which decisions are “based on hard evidence to ensure that investments deliver results” and which measures success on the basis of improvements in people’s lives rather than on “the number of programs run.”³ She also reinforced statements made by USAID Administrator Rajiv Shah about being transparent about successes and failures and learning from them.

1 Andrew Natsios, “The Clash of the Counter-bureaucracy and Development,” Center for Global Development, July 2010.
2 “Fact Sheet: U.S. Global Development Policy,” September 22, 2010. http://www.whitehouse.gov/the-press-office/2010/09/22/fact-sheet-us-global-development-policy (accessed October 27, 2010).
3 Hillary Rodham Clinton, “Leading Through Civilian Power: Redefining American Diplomacy and Development,” Foreign Affairs, November/December 2010. http://www.foreignaffairs.com/articles/66799/hillary-rodham-clinton/leading-through-civilian-power.

www.InterAction.org | 1400 16th Street, NW, Suite 210, Washington, DC 20036 | 202-667-8227
