When Evaluation Is a Performance, Not a Tool

Sheryl Foster

8/12/2025 · 3 min read

There’s a kind of evaluation that looks great on paper but tells you nothing. It’s clean, polished, box-ticking, and completely useless.

We’ve all seen it: glossy reports stuffed with numbers no one reads and insights no one uses. Evaluation turns into a performance. Something we do to prove we’re legitimate, rigorous, fundable. Not something we actually use.

Mario Morino, in Leap of Reason, doesn’t sugarcoat it: If you don’t have good information, you’re flying blind. If you don’t use the information, you’re flying blind on purpose. That’s the gap between performance and practice. One exists to impress. The other exists to learn and adapt.

Why This Happens
The sector is drowning in performative evaluation. Not because leaders don’t care about learning, but because the system rewards appearance over substance.

Funders ask for evidence but don’t fund it. Executives demand evaluation but never look at it. Program teams are told to “reflect” after sprinting through three initiatives back-to-back. The output? Something safe. Something tidy. Something that won’t make anyone uncomfortable.

But learning doesn’t happen in safe, tidy spaces. It happens when you admit what didn’t work, when you follow the odd data point, when you’re willing to be wrong before you get it right. Real evaluation doesn’t just celebrate wins; it also exposes the misses. If your report never contains bad news, that’s not rigor; it’s risk management.

The Cost of the Performance
When evaluation becomes theater, the point is lost. You waste staff time. You produce shallow learning. You hand decision-makers a false sense of clarity. Worst of all, you miss your shot at getting better.

I’ve seen organizations afraid to write “not sure” or “still learning” in a report. Yet the strongest leaders treat evaluation like a lab, not a courtroom. They’re not building a defense. They’re trying to figure out what’s actually going on.

From Compliance to Capacity
Too often, evaluation is treated like a scorecard. It doesn’t have to be punitive. Morino makes the point well: performance isn’t about punishing people; it’s about helping them succeed.

When staff see data as a support, not a spotlight, they’re more likely to engage with it, learn from it, and use it.

What Useful Evaluation Looks Like
Useful evaluation clears a path to a real decision. That’s the bar.

It doesn’t have to be fancy. It doesn’t have to impress. It just has to inform the work. That could mean spotting a red flag early, catching a surprising trend, or giving frontline staff permission to speak up about what’s not landing.

Sometimes that’s a survey. Sometimes it’s a sticky note on a wall. Sometimes it’s one question in a meeting: “What would we do differently next time, and why?”

Try This Instead
If your team is stuck in performance mode, stop polishing the stage lights and start checking the wiring. Ask:

· Who is this evaluation really for? If it’s just for a funder who’ll skim the executive summary, you’re aiming at the wrong target. The first audience should be the people who can act on the findings, such as your staff, your board, or your partners.

· What decision will this help us make? If you can’t link a metric to a decision, whether to scale, change, stop, or double down, why measure it? Good evaluation clarifies choices, not just performance.

· Are we collecting data because we need it, or because we’re afraid not to? Fear-based evaluation fills folders, not strategy meetings.

If your honest answer to any of these is “we’re not sure,” you’ve already started learning. The next step is to design evaluation that answers your own most urgent questions, not just someone else’s checklist. That’s when it becomes a tool, not a costume.

💬 What’s the most performative evaluation request you’ve ever seen?
💬 What’s something simple your team did that actually led to a change?