Cross-post: Rethinking Outcomes in a Crisis Context

  • By Hallie Preskill and Kathleen Lis Dean, PhD, August 5, 2020

This post was created by members of the GEO Strategic Learners Network, a community of GEO members working at the intersection of learning, evaluation and strategy. For more information, including how to join the Strategic Learners Network, please click here.

In this critical time, and with a sense of urgency, many philanthropic organizations are shifting the ways in which they are supporting grantees. In addition to developing and deploying rapid response funding mechanisms and participating in broader, collaborative funding efforts, they are lessening the burden on grantees by:

  • Allowing grant resources to be redirected from programs to general operating support;
  • Hitting the pause button on strategic and programmatic evaluations;
  • Providing early payouts;
  • Approving extensions;
  • Moving remaining 2020 funding to COVID-response efforts; and
  • Postponing or suspending the usual grantee reporting requirements, though this step can be challenging.

Grantees have appreciated the loosening of these requirements as they focus on their service delivery efforts and organizational continuity. However, foundation staff are increasingly wondering about the last point: “What grantee outcomes will we be able to report on if they are not submitting reports?” “What do we mean by relaxing reporting requirements?” “How can we ensure that our funds are used effectively?” And, “For how long should these changes be in effect?”

The issue at the core of this situation is “What kinds of outcomes will result from this crisis and how can we best listen for and understand them?” There will be outcomes, but the ones we focus on are not necessarily those that grantees originally proposed. So, how can foundations ensure that the content and process of evaluating outcomes in the wake of this crisis are useful and least disruptive, while also ensuring that we learn during this time? As Sanjeev Sridharan proposes, “The pandemic provides an opportunity for us to ask ourselves how evaluations can be adapted to be helpful at a time of crisis.” And, we would add, in ways that do not get in the way of service delivery.

In the spirit of evaluative thinking and learning, and in place of grantee reports and full-blown evaluations, we believe there are alternative ways of assessing the choices foundations have made and the actions both foundations and grantees have taken. First, as always, we start with questions foundations should ask themselves:

  • What are our expectations of what grantees are doing during this time?
  • How has grantees’ work shifted? Are there things they have stopped doing for the time being in order to focus on community needs brought on by COVID-19 and its rippling impacts, and have they started doing other things that we should consider capturing through our learning activities?
  • What is important to understand about grantees’ work and their contexts as it relates to our strategy?
  • How would we define success during this time (with regard to the grantees’ use of the foundation’s funds)? How will we know if and how our support made a difference to grantees and the communities they serve?
  • What information do program staff need to be able to report to leadership and the board about how grantees have used the foundation’s resources during this time of crisis?
  • What do program staff need to consider in a future period of response, recovery, or reset?

The answers to these questions relate to one or more of the following categories of outcomes:

  • Grantee activities to serve communities, including how they’ve focused on serving those most in need (structural equity lens)
  • Grantee reach to community members—breadth and depth
  • Influence within the community and with policymakers for services and change
  • Grantee resiliency—extent to which foundation support has contributed to grantee sustainability and health
  • Grantee connection to community—extent to which and how new connections have been made or strengthened
  • Grantee efforts to support organizers and advocates to improve access and services
  • Grantee ability to adapt and respond quickly and effectively
  • Grantee efforts to collaborate and work on systems changes

Finally, foundations should consider how to collect information on the guiding questions and outcomes. Since the goal would be to reduce the burden on grantees, choosing an approach requires that we be adaptive and humble. In the spirit of learning whatever is possible, we suggest the following ways of gathering insights about the grantees’ recent work:

  • Interview a sample of grantees to gather examples, stories, and any available metrics they are collecting. This method would surface stories of adaptation, resilience, strength, need, and coordination, along with other insights.
  • Host and facilitate virtual convenings with clusters of grantees to gather information about their activities and what they’re learning (this would also facilitate peer learning).
  • Send a brief, 3–5 open-ended question survey to all or a sample of grantees asking for an update on their activities and examples or stories of effects, influence, and impact.
  • Develop a standard protocol for program officers to use in conversations with all grantees in the next few months.
  • When possible, coordinate data collection with other stakeholders and funders to reduce reporting times for grantees.

While each of these methods requires time from grantees, it is possible that they will want to participate and share their stories with funders. These questions and methods may help them process aloud what they’ve been doing and give them space for reflection. Furthermore, gathering this information has the potential to strengthen the grantee/foundation relationship by demonstrating the foundation’s willingness to be flexible, its desire to understand how grantees have responded, and the value it places on grantee input. Using these data collection approaches also means that the data can be shared with others in the field (such as the Fund for Shared Insight) as a means of informing field-level knowledge and practice, in a way that individual grant reports cannot.

The information (data) collected from these efforts could weave together a narrative of how grantees are pivoting and responding in this time of uncertainty, hardship, and significant need, and how the foundation’s resources are contributing to meaningful unexpected (or unplanned-for) outcomes and impacts. Some of the changes instituted might be so effective that they become standard practice in the future. Who knows—perhaps we will look back on this time and wonder why we didn’t do things differently before the pandemic. It feels like anything is possible today.

Hallie Preskill

Managing Director

Hallie Preskill, PhD, is a Managing Director with FSG, where she leads the Strategic Learning and Evaluation practice, and advises on a wide range of evaluation and learning projects. She also contributes to the field by co-writing tools, guides, articles, and white papers on a range of evaluation and learning topics. Prior to joining FSG, Hallie spent more than 20 years in academia, teaching about evaluation, training, and organizational learning. She has written several books and articles on evaluation as a catalyst for individual, group, organizational, and community learning. Dr. Preskill was President of the American Evaluation Association in 2007.

Kathleen Lis Dean, PhD

Senior Director for Evaluation, Outcomes, and Learning

Kathleen Lis Dean, PhD, is the Senior Director for Evaluation, Outcomes and Learning for the Saint Luke’s Foundation in Cleveland, Ohio. In this role, Dr. Dean facilitates organizational learning to inform strategy and improve philanthropic practice in support of the Foundation’s mission to achieve health equity. Kathleen also serves as a mentor for the Higher Learning Commission’s Assessment Academy, providing guidance and advice for colleges and universities working to build their capacity for student learning assessment. Prior to joining the Foundation, she served as an organizational development consultant with ModernThink, in institutional effectiveness roles at John Carroll University and the University of Maryland, and as a career development professional at Wellesley, Bryn Mawr, and Haverford Colleges.