WASHINGTON EVALUATORS BROWN BAG SESSION

Evaluating Complex Development Programs: Challenges and Promising Approaches

Presenter: Dr. Michael J. Bamberger, 
Independent Development Evaluation Consultant

Date: Monday, 21 October 2013
Time: 12:00 noon to 1:30 p.m.
Location: Marvin Center, The George Washington University
800 21st Street N.W., Room 413-414, Washington, D.C. 20052

Abstract
While most development agencies are moving towards complex, multi-component, multi-donor programs or interventions, there are as yet few widely accepted methodologies for evaluating these programs. Many agencies assume that it is not possible to conduct a “rigorous” causal analysis of the outcomes and impacts of complex programs; this is evidenced by the many agencies that argue attribution analysis is not possible for complex programs and that evaluations can only assess the contribution a particular agency has made towards a particular development goal. We will discuss the different dimensions of complexity, the unique challenges facing the evaluation of complex programs, and why evaluators have found it so difficult to address these challenges. Many “complex” evaluations continue to apply conventional evaluation designs developed for assessing “simple” projects, designs that do not recognize or address the unique characteristics of complex programs. We then present the outline of a strategy that integrates current best practice for evaluating complex programs. The strategy incorporates emerging social research technologies, including the widening application of mobile phones and other hand-held data collection and analysis devices, the use of big data, and techniques for strengthening counterfactual designs.

Bio of Presenter
Michael Bamberger has a Ph.D. in sociology from the London School of Economics. He has 45 years of experience in development evaluation, including a decade working with NGOs in Latin America and almost 25 years working on evaluation and on gender and development issues with the World Bank across most of the social and economic sectors in Latin America, Africa, Asia, and the Middle East. He has also spent a decade as an independent evaluation consultant, working with ten different UN agencies, the World Bank, regional development banks, bilateral agencies, developing-country governments, and international NGOs to help design and implement development evaluations, organize evaluation workshops, and produce handbooks and guidelines. He is a member of the International Advisory Panel of the Evaluation Office of UNDP and an advisor to the Evaluation Office of the Rockefeller Foundation. He has published four books on development evaluation, numerous handbooks on evaluation methodology, and articles in leading evaluation journals. He has been active for 20 years in the American Evaluation Association and has served on the editorial advisory boards of the American Journal of Evaluation, New Directions for Evaluation, the Journal of Mixed Methods Research, and the Journal of Development Effectiveness. He is on the faculty of the International Program for Development Evaluation Training (IPDET), where he lectures on conducting evaluations under budget, time, and data constraints; the gender dimensions of impact evaluation; mixed-methods evaluation; and the evaluation of complex programs. He has also been teaching for over a decade at the Foundation for Advanced Studies on International Development in Tokyo.

His recent publications include: RealWorld Evaluation: Working Under Budget, Time, Data and Political Constraints (with Jim Rugh and Linda Mabry; Sage Publications, 2012); How to Design and Manage Equity-Focused Evaluations (with Marco Segone; UNICEF, 2011); Introduction to Mixed Methods in Impact Evaluation (InterAction Series on Impact Evaluation, scheduled for publication in July 2012); Using Mixed Methods in Monitoring and Evaluation: Experience from International Development (with Vijayendra Rao and Michael Woolcock; in the Sage Handbook of Mixed Methods in Behavioral and Social Research, 2010); and Institutionalizing Impact Evaluation Within the Framework of a Monitoring and Evaluation System (World Bank, 2009).

Nearest Metro Station: Foggy Bottom (Orange and Blue Lines)

Please register or RSVP to Brian Yoder.
