Are you really ready for developmental evaluation? You may have to get out of your own way.

Developmental evaluation requires changing the way evaluation commissioners and evaluators work together. If it feels like business as usual, you’re probably not doing it.

On May 31, 2019, Community Science sponsored the webinar “Developmental Evaluation: Rewards, Perils, and Anxieties.” A foundation officer (Kristi Ward, Bush Foundation) and two evaluators (Kien Lee, Community Science and Tanya Beer, Center for Evaluation Innovation) discussed what developmental evaluation is intended to be and the realities of practicing it. This article is based on Tanya Beer’s introductory comments.

First, let me clarify that this is not an introduction to developmental evaluation (DE). I am assuming you’ve done some thinking and reading, and perhaps even some work, on developmental evaluation. [If not, start here, and then come back and read this before you commission or conduct a developmental evaluation.]

It’s time for a candid conversation about how the conventional routines of both funders and evaluators are getting in the way of high-value DE.

Some of the most basic and taken-for-granted practices of evaluators and evaluation commissioners are at odds with innovation in complex systems. These practices are codified into our evaluation commissioning, designing, contracting, and budgeting routines, and even into evaluation firms’ basic business models. They create tough-to-negotiate tensions with DE.

Tension 1: Our routines around setting pre-determined evaluation questions, methods, timelines, and deliverables do not fit with the uncertainty and unpredictability of innovation in systems. If you start your DE with pre-set questions and a fully developed theory of change, you already have missed the point.

Tension 2: With DE, the need for real-time data is in tension with conventional conceptions of methodological rigor, requiring a constant negotiation of trade-offs between level of certainty and speed.

Tension 3: Conventional contract accountability mechanisms, such as payments triggered when pre-set deliverables are provided, are in tension with the naturally uneven flow of innovation and DE.

Tension 4: The inevitable moment when external evaluation users such as foundation boards and leaders ask “what’s our impact?” is in tension with the focus and fundamental purpose of DE.

Developmental evaluation can create anxieties and disappointments for evaluation commissioners and evaluators who apply conventional evaluation routines to an evaluation endeavor that is fundamentally different.

We’ve learned — often painfully — a few basic tips about how to get out of your own way from the very beginning of a DE:

Tips for Evaluation Commissioners:

  • Design Requests for Proposals differently. When creating an RFP for developmental evaluation, don’t lay out all of your evaluation questions and ask applicants for a full description of the evaluation design, methods, and deliverables. Instead, identify a few early-stage questions that are currently puzzling you, and ask applicants to suggest some ideas for how they might bring relevant data to the table.
  • Allow for flexible budgeting. Anticipate higher budgets than for other types of evaluation, as evaluators need to be much more present at the strategy table in an ongoing way.
  • Increase time and access. Plan for your evaluator to be present at the strategy table far more frequently than for other kinds of evaluation, particularly during the first several months of the engagement as they build a deep understanding of the strategy team’s aims and thinking.
  • Redesign contract terms. Rather than pinning contract accountability and payment to specific predetermined deliverables, consider designing a contract that triggers payment based on a regular performance review.
  • Plan for the impact question. Anticipate that your board will ask questions about the impact of the systems innovation (probably prematurely) and protect the developmental evaluator from this kind of mid-way switch-up in purpose, questions, user, and design.

Tips for Evaluation Consultants:

  • Push back on commissioners. Look for — and push for — all of the above with evaluation commissioners. 
  • Experiment with flexible staffing. Consider how your staffing models could be restructured to allow for rapid deployment of unexpected methods.
  • Allow for unknowns and “I have no ideas”. Check your habit of asking for fully fleshed-out theories of change at the outset.
  • Stop doing midterm and final reports. Instead of packaging and delivering evaluation findings at regular pre-planned intervals, get accustomed to lighter-burden, rapid delivery of insights when the strategy question is actually on the table. 
  • Keep track of developments. It can be useful for developmental evaluators to keep a log of key developments in the innovation, as well as the team’s rationale for its changes in strategy.

Above all, just like innovative strategists, developmental evaluators and commissioners need to approach their work together as an innovative enterprise…testing, learning, and adapting the evaluative approach as we go.
