This teaching case focuses on an evaluation that aimed to support real-time strategic learning over the course of a 10-year initiative. It chronicles the challenges of achieving that goal as the strategy shifted and evaluators grappled with how to provide information of real value to the work.
Teaching cases are factual stories of one foundation’s in-depth experiences related to evaluation and learning. Stories highlight important challenges that confront foundations in their evaluation work, and put readers in the role of decision makers who are confronted with problems and options for solutions as the story unfolds. This teaching case was produced for the Evaluation Roundtable, a network of evaluation and learning leaders in foundations.
“Real-time evaluation” or “developmental evaluation” aims to promote strategic learning by providing regular, timely information about how a strategy is unfolding, which organizations then use to inform future work and make course corrections. This case tells the story of how external evaluators and program staff at the David and Lucile Packard Foundation took a risk on what for them was a new and nontraditional approach to evaluation.
This evaluation approach succeeds only when a particular organizational and team culture is present. The evaluators had tried the approach at another foundation and encountered several challenges there. While they had faith in this newer form of evaluation, their earlier experience also left them wary.
The 10-year Preschool for California’s Children grantmaking program was among the largest and most audacious programs Packard had ever funded. Hand in hand with this bold strategy, Packard commissioned an evaluation to accompany the initiative throughout its lifecycle.
The case chronicles the evaluation’s nine-year evolution and identifies key points at which it changed course, either because methods were not working or because the Foundation’s strategy shifted. It highlights several questions and challenges, all of which are relevant to strategic learning approaches:
- How to ensure the evaluation is useful to multiple audiences (board, funder, grantees)
- How to “embed” the evaluator in a reasonable way while maintaining role boundaries
- How to manage the often competing demands of learning and accountability
- How to time data collection so it is “just in time” but also reliable and credible
- How to get information that does not just verify what program officers already know