publication May 2014
How Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision Making
Decades of research have shown that despite the best of intentions, and even when actionable data are presented at the right time, people do not automatically make good and rational decisions. This brief highlights common cognitive traps that can trip up philanthropic decision making, and suggests straightforward steps to counter them.
publication Mar 2014
Evaluation for Strategic Learning: Assessing Readiness and Results
Evaluation for strategic learning is the use of data and insights from a variety of information-gathering approaches—including evaluation—to inform decision-making about strategy. This brief explores organizational preparedness and situational suitability for evaluation that supports strategic learning, and how to understand if this type of evaluation is working.
publication May 2013
Eyes Wide Open: Learning as Strategy Under Conditions of Complexity and Uncertainty
How can foundations avoid the traps that sabotage their learning and hamper their ability to guide strategy in complex contexts? This article explores a series of self-created “traps,” including 1) linearity and certainty bias; 2) the autopilot effect; and 3) indicator blindness.
publication Mar 2013
The Art of the Nudge: Five Practices for Developmental Evaluators
Conventional program evaluation is a poor fit for the uncertain and emergent nature of innovative and complex initiatives. Developmental evaluation offers an alternative. This article offers five practices that help developmental evaluators detect and support opportunities for learning and adaptation by providing right-timed feedback.
presentation Dec 2012
Strategy Implementation: Addressing the Disconnect between Decisions and Data (2012 Evaluation Roundtable Convening)
Drawing on benchmarking data gathered from the Evaluation Roundtable network, this presentation examines organizational barriers to learning from strategies, warns against cognitive traps that hinder learning and decision-making, and describes approaches to avoid or counteract those traps. It explores how the culture and role of evaluation in foundations can disconnect learning from strategy.
publication Oct 2012
Strategic Learning in Practice: Tools to Create the Space & Structure for Learning
Evaluation that supports strategic learning applies evaluation findings, as well as non-evaluation data, to improve strategy. This brief describes practical steps for designing and using theories of change and strategic learning debriefs as tools for creating the space and structure that strategic learning requires.
publication Aug 2012
Evaluating Social Innovation
Common evaluation approaches and practices can constrain the kind of continuous learning and adaptation that is necessary for innovation. This brief offers lessons about an emerging evaluation approach—developmental evaluation—which supports the adaptation that is so crucial to innovation.
publication Sep 2011
Evaluation of the David and Lucile Packard Foundation’s Preschool for California’s Children Grantmaking Program
This teaching case focuses on an evaluation that aimed for real-time strategic learning over the course of a 10-year initiative. It chronicles the challenges of achieving that goal as the strategy shifted and evaluators grappled with how to provide information of real value to the work.
publication Sep 2011
Advocacy Evaluation Case Study: The Chalkboard Project
An evaluation of a civic engagement and advocacy effort in Oregon offered an opportunity to incorporate both retrospective and prospective approaches, along with new advocacy evaluation tools. The case illustrates how evaluation can support strategic learning among advocates.
publication Jul 2011
Evaluation to Support Strategic Learning: Principles and Practices
Strategic learning refers to efforts to incorporate reliable data and ongoing reflection into social change strategies. This brief digs into the definition of strategic learning. It explores how evaluation can support it, and proposes a set of principles that represent the “non-negotiables” of evaluation for strategic learning.