
12. Evaluation Plan

How to use this activity

This activity supports practical application of the concepts in your lesson.

  1. Download this activity as a docx file.
  2. Work through the activity step by step. Keep your answers concise and focused.
  3. Return to your lesson when you are done.

What to do: Define what evidence you will collect, when, from whom, and how you will use it to improve your training

Expected output: An evaluation plan linking intended outcomes to evidence collection and improvement actions

Approximate time: 20–30 minutes

Before you start

You will typically need:

  • Outputs from earlier activities (if applicable)
  • Notes from your current lesson

Instructions

In Activity 2: Theory of Change, you articulated what change you hoped your training would contribute to. In Activity 10: Assessment Plan, you designed ways to check whether learners are learning during the training itself. Now you are asking a different question: did the training contribute to real-world change after learners left the room?

Evaluation looks beyond the training event. It asks whether the outcomes you designed for actually translated into changed practice, new capabilities applied in context, or shifts in the systems you mapped in Activity 1. This activity helps you build a plan for finding out.

Build your evaluation plan

Step 1: Name the change you are looking for

Return to your Theory of Change from Activity 2. Look at the outcomes and impact you articulated — the changes that sit beyond the training event itself.

  • What change do you hope to see in your learners' practice after the training? (Be specific: not "they will know more" but "they will use X method in their work" or "they will share materials with colleagues.")
  • What change do you hope to see in the broader system? (e.g., a team adopts a new workflow, an institution updates its approach, a community gains a new capability.)
  • Over what timeframe? (Some changes are visible in weeks; others take months or years. Be realistic.)

Step 2: Identify evidence that would indicate change

For each change you named, ask: what would I see, hear, or find if this change were actually happening?

Change 1:

  • What evidence would indicate this change? (e.g., learners report using the skill, you observe changed practice, outputs or products are different, colleagues confirm a shift)
  • Where would you find this evidence? (e.g., follow-up survey, workplace observation, review of learner outputs, conversation with managers or peers)
  • From whom? (The learners themselves? Their colleagues? Their supervisors? Their own learners?)

Change 2:

  • What evidence would indicate this change?
  • Where would you find this evidence?
  • From whom?

Add more as needed for each change from Step 1.


Step 3: Plan when and how to collect evidence

Evidence you never collect is evidence you don't have. Be concrete about logistics.

For each piece of evidence from Step 2:

  • When will you collect it? (e.g., 2 weeks after training, 3 months after, at a follow-up session)
  • How will you collect it? (e.g., short survey, follow-up interview, email check-in, observation visit, review of shared outputs)
  • What is the minimum viable version? (If your ideal plan is not feasible, what is the simplest version that still gives you useful information? A 3-question email may tell you more than a survey you never send.)

Step 4: Distinguish signal from noise

Not all evidence is equally useful. Some changes are easy to measure but do not mean much. Others are hard to capture but matter deeply.

Review your evidence plan and ask:

  • Which indicators are "easy but weak"? (e.g., satisfaction scores, attendance at follow-up events, self-reported confidence.) These tell you something, but they do not tell you whether practice changed.
  • Which indicators are "harder but stronger"? (e.g., observable changes in how learners work, new outputs they produce, feedback from people they work with.) These are more effort to collect but give you a clearer picture.
  • What could create a misleading signal? (e.g., learners say they changed but haven't, or external factors caused the change rather than your training.)

For at least one indicator, note: "This would look like change but might not be" and explain why.


Step 5: Plan how you will use what you learn

Evaluation is not a reporting exercise. It is a design tool. The point is to improve the next iteration of your training.

  • If the evidence shows the change you hoped for: What worked? What will you keep, and what will you refine?
  • If the evidence shows no change or unexpected outcomes: What might explain this? Was the training design the issue, or were there system-level barriers (the kind you identified in Activity 1)?
  • What specific part of your training would you revisit first? (e.g., the outcomes, the activities, the assessment, the follow-up support)
  • Who will you share the findings with? (Funders, co-facilitators, learners, your own team?)

If you already have a training

You may already have some evaluation data — even informal.

  • What do you currently know about whether your training leads to change after the event? How do you know?
  • Where are the gaps in what you know? For each gap, add one concrete evidence-collection step to your plan.
  • If you have collected feedback before, review it honestly: how much of it tells you about satisfaction, and how much tells you about actual change in practice?

If you are creating a new training

You are planning evaluation before you have delivered, which is the right time to think about it — not after.

  • Start from your Theory of Change. Pick the two outcomes you care about most and focus your evaluation plan on those. You do not need to evaluate everything.
  • For timing, plan at least one touchpoint after the training ends. Even a short follow-up email at 4 weeks asking "What have you applied?" gives you real evidence.
  • Think about who can help you collect evidence. If your learners' managers or colleagues could tell you whether practice changed, consider including them in your plan — with learners' consent.
  • Keep it feasible. A realistic plan you actually execute is worth more than an ambitious plan that sits in a drawer.

Context check

  • What constraints limit your ability to follow up with learners after the training? (e.g., no contact details, institutional barriers, learners in different organisations or countries)
  • What resources (time, tools, people) would you need to implement this plan? Do you have them?
  • Are there ethical considerations in how you collect evidence? (e.g., consent, anonymity, power dynamics between you and learners)

Reflection

  • You started this workbook by articulating what change you hoped for (Activity 2), and now you are asking whether the training contributed to it. What has shifted in how you think about the relationship between your training and the change you want to see?

Reuse in later sections

Your evaluation plan feeds into: