Search Results

Showing 25-36 of 50

  • publication Jul 2014

    Evaluating Networks for Social Change: A Casebook

    This casebook profiles nine evaluations of network effectiveness, each designed to fit how networks develop and function.

    By Julia Coffman

  • presentation Jul 2014

    Evaluation and Learning for Aligned Action: 2014 Evaluation Roundtable Convening

    This presentation provides the context for the 2014 Evaluation Roundtable convening, as well as discussion outlines, benchmarking data, case examples, and key lessons and implications for evaluation and learning. 

    By Tanya Beer, Julia Coffman

  • publication Jul 2014

    First Among Equals: The Evaluation of the McConnell Foundation Social Innovation Generation Initiative

    This teaching case explores both the actual and the potential evaluation of a foundation strategy focused on social innovation in Canada.

  • publication Jun 2014

    Evaluation and Learning for Complexity and Aligned Action: Framing Brief

    This framing brief, developed for the 2014 Evaluation Roundtable convening, explores whether and how sectoral shifts in strategy mindset and practice toward greater complexity and emergence call for changes in the role of evaluation and learning in foundations.

    By Tanya Beer, Julia Coffman

  • publication May 2014

    How Shortcuts Cut Us Short: Cognitive Traps in Philanthropic Decision Making

    Decades of research have shown that despite the best of intentions, and even when actionable data are presented at the right time, people do not automatically make good and rational decisions. This brief highlights common cognitive traps that can trip up philanthropic decision making, and suggests straightforward steps to counter them. 

    By Tanya Beer, Julia Coffman

  • presentation Apr 2013

    Evaluation in Foundations: 2012 Benchmarking Data

    This presentation, developed for the 2012 Evaluation Roundtable convening, examines how foundations structure their evaluation and learning functions, invest in evaluative activities, and use evaluative information. Findings are based on surveys of 31 foundations with a strong commitment to evaluation and on interviews with 38 foundations.

    By Tanya Beer, Julia Coffman

  • publication Jan 2013

    Benchmarking Evaluation in Foundations: Do We Know What We Are Doing?

    Evaluation in philanthropy, with staff assigned to evaluation-related responsibilities, began in the 1970s and has evolved alongside the field in the decades since. This Foundation Review article presents findings from 2012 research on what foundations are doing in evaluation and discusses their implications.

    By Tanya Beer, Julia Coffman

  • presentation Dec 2012

    Strategy Implementation Addressing the Disconnect between Decisions and Data: 2012 Evaluation Roundtable Convening

    Drawing on benchmarking data gathered from the Evaluation Roundtable network, this presentation examines organizational barriers to learning from strategies, warns against cognitive traps that hinder learning and decision-making, and describes approaches to avoid or counteract those traps. It explores how the culture and role of evaluation in foundations can disconnect learning from strategy.

    By Tanya Beer, Julia Coffman

  • publication Nov 2012

    Evaluation for Models and Adaptive Initiatives

    This Foundation Review article outlines how evaluation differs for two main types of grantmaking programs: models, which provide replicable or semi-standardized solutions, and adaptive initiatives, which are flexible programming strategies used to address problems that require unique, context-based solutions.

    By Julia Coffman

  • publication Aug 2012

    Advocacy & Public Policy Grantmaking: Matching Process to Purpose

    How should foundations approach their advocacy and public policy grantmaking? This report offers three options and explores what it means for foundations to design grantmaking that builds the capacity and influence of an advocacy field.

    By Tanya Beer

  • publication Aug 2012

    Evaluating Social Innovation

    Common evaluation approaches and practices can constrain the kind of continuous learning and adaptation that is necessary for innovation. This brief offers lessons about developmental evaluation, an emerging approach that supports the continuous adaptation innovation requires.

    By Tanya Beer

  • publication Sep 2011

    Evaluation of the David and Lucile Packard Foundation’s Preschool for California’s Children Grantmaking Program

    This teaching case focuses on an evaluation that aimed to support real-time strategic learning over the course of a 10-year initiative. It chronicles the challenges of pursuing that aim as the strategy shifted and the evaluators grappled with how to provide information of real value to the work.
