Resources

  • publication Jul 2014

    First Among Equals: The Evaluation of the McConnell Foundation Social Innovation Generation Initiative

    This teaching case explores both the real and potential evaluation of a foundation strategy focused on social innovation in Canada. 

    By Susan Parker

  • publication Jul 2014

    Evaluating Networks for Social Change: A Casebook

    This casebook profiles nine evaluations of network effectiveness that are designed to fit with how networks develop and function.

    By Madeleine Taylor, Peter Plastrik, Julia Coffman, Anne Whatley

  • publication Jul 2014

    A Practical Guide to Evaluating Systems Change in a Human Services System Context

    A consensus has grown in the public, philanthropic, and nonprofit sectors that we must use systems change approaches to effectively address society’s most intractable challenges. Evaluating systems change efforts can be messy, if not outright overwhelming. This guide offers a pragmatic framework. 

    By Nancy Latham

  • publication Jun 2014

    Evaluation and Learning for Complexity and Aligned Action: Framing Brief

    This framing brief developed for the 2014 Evaluation Roundtable convening explores whether and how sectoral shifts in strategy mindset and practice toward more complexity and emergence call for changes in the role of evaluation and learning in foundations. 

    By Tanya Beer, Julia Coffman

  • publication May 2014

    How Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision Making

    Decades of research have shown that despite the best of intentions, and even when actionable data are presented at the right time, people do not automatically make good and rational decisions. This brief highlights common cognitive traps that can trip up philanthropic decision making, and suggests straightforward steps to counter them. 

    By Tanya Beer, Julia Coffman

  • publication Mar 2014

    Evaluation for Strategic Learning: Assessing Readiness and Results

    Evaluation for strategic learning is the use of data and insights from a variety of information-gathering approaches, including evaluation, to inform decision making about strategy. This brief explores organizational preparedness and situational suitability for evaluation that supports strategic learning, and how to understand whether this type of evaluation is working.

    By Anna Williams

  • publication Jan 2014

    Monitoring and Evaluation for Human Rights Organizations: Three Case Studies

    The promotion and protection of human rights around the world is driven by principles of transparency and accountability. These same principles drive monitoring and evaluation (M&E) efforts. Yet, conceptual, capacity, and cultural barriers often discourage the use of M&E in human rights work. This brief offers concrete examples of how to tackle the unique challenges of evaluating human rights work.

    By Rhonda Schlangen

  • publication Nov 2013

    Pathways for Change: 10 Theories to Inform Advocacy and Policy Change Efforts

    One of our most popular publications, this brief, produced in collaboration with ORS Impact, summarizes 10 theories grounded in social science about how policy change happens. The theories can help to untangle beliefs and assumptions about the inner workings of the policymaking process and identify causal connections supported by research to explain how and why a change may or may not occur.

    By Sarah Stachowiak

  • publication May 2013

    Eyes Wide Open: Learning as Strategy Under Conditions of Complexity and Uncertainty

    How can foundations avoid the traps that sabotage their learning and hamper their ability to guide strategy in complex contexts? This article explores a series of self-created “traps,” including 1) linearity and certainty bias; 2) the autopilot effect; and 3) indicator blindness.

    By Patricia Patrizi, Elizabeth Heid Thompson, Julia Coffman, Tanya Beer

  • presentation Apr 2013

    Evaluation in Foundations: 2012 Benchmarking Data

    This presentation, developed for the 2012 Evaluation Roundtable convening, examines how foundations structure their evaluation and learning functions, invest in evaluative activities, and use evaluative information. Findings are based on surveys of 31 foundations with a strong commitment to evaluation and on interviews with 38 foundations.

    By Tanya Beer, Julia Coffman, Patricia Patrizi, Elizabeth Heid Thompson

  • publication Mar 2013

    The Art of the Nudge: Five Practices for Developmental Evaluators

    Conventional program evaluation is a poor fit for the uncertain and emergent nature of innovative and complex initiatives. Developmental evaluation offers an alternative. This article offers five practices to help developmental evaluators detect and support opportunities for learning and adaptation leading to right-timed feedback.

    By Marc Langlois, Natasha Blanchet-Cohen, Tanya Beer

  • publication Jan 2013

    Benchmarking Evaluation in Foundations: Do We Know What We Are Doing?

    Evaluation in philanthropy, with staff assigned to evaluation-related responsibilities, began in the 1970s and has evolved, along with philanthropy, in the decades since. This Foundation Review article presents findings, based on 2012 research, about what foundations are doing on evaluation and discusses their implications.

    By Julia Coffman, Tanya Beer, Patricia Patrizi, Elizabeth Heid Thompson
