When you’re evaluating your nonprofit programs, you need data that is not only accurate but relevant to your success.

Data That Helps, Not Harms

One problem that I’ve often seen is collecting irrelevant data—data that is convenient, but that you don’t need. Collecting data that you won’t use wastes staff and participant time. People may get annoyed. At best, you might use the data later, like that bag of spare parts to who-knows-what that you keep in a drawer.

More worrisome is relying on overly isolated data. Isolated data can put your organization at a disadvantage if it doesn’t capture your true value.

Say I’m giving a workshop, and I want to evaluate the results (because I love evaluating!). This past winter, I delivered a workshop that four people attended. If that were all we knew about that workshop, we wouldn’t know why only four people came. We might guess that few people were interested. However, a turnout of four people looks a lot better when you know that it was during a snowstorm here in the Washington, DC area. We don’t like to drive in the snow.

Think Causal Connections

One data point, by itself, tells you that something happened. In some situations, such as in a laboratory experiment, you can safely assume that only one thing could have caused it to happen. Outside the lab, the world is more complicated. When you want to know why something happened, you need to look for many possible causes, as opposed to a single cause. Here are two ways to do that.

  • Include More Perspectives.

Learn all you can about what’s important to success from as many perspectives as possible. That understanding will let you explain how the data you see compare with what you’d expect, based on what is known.

Many sources can provide valuable perspectives that can benefit your evaluation. Reflect on your own experience. Talk with participants, staff, and other stakeholders. Review the knowledge in the field.

Look for things that your organization does (or could do) to achieve desired results, plus things outside your control that are relevant to your situation. These could be anything from the weather, to a national law, to a history of racial and economic disparities.

  • Analyze More Data Together.

As you’re looking at the numbers to measure your outcomes, keep in mind what causes those outcomes. Include those things in your analysis!

Charts are a great place to start seeing relationships between two or more things. In the chart below, we can quickly see that four attendees is just what we’d expect at the average workshop on a snowy day.

[Chart: Average Number of Attendees]
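If your data live in a spreadsheet, this kind of comparison is easy to automate. Here is a minimal sketch in Python using pandas, with made-up attendance numbers (not real program data) purely to illustrate grouping an outcome by a suspected cause:

```python
# Minimal sketch: comparing workshop attendance with a contextual factor (weather).
# The numbers below are made-up illustrations, not real program data.
import pandas as pd

workshops = pd.DataFrame({
    "date": ["2024-01-10", "2024-01-24", "2024-02-07", "2024-02-21"],
    "attendees": [22, 4, 19, 25],
    "weather": ["clear", "snow", "clear", "clear"],
})

# Average attendance grouped by the factor we suspect matters.
avg_by_weather = workshops.groupby("weather")["attendees"].mean()
print(avg_by_weather)
```

The same grouped averages can be passed straight to a charting tool to produce a bar chart like the one above; the point is that the causal factor is a column in your analysis, not an afterthought.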

Focusing on the causal connections that explain the data is key to program evaluation that works.   

To learn more, join me for my Program Evaluation That Works workshop Thursday, May 9, 9 am to 12 pm ET at Foundation Center Northeast Washington, DC, a service of Candid.

About the Author(s)


Management Staff
