Do you want to evaluate whether a policy or measure you oversee, aimed at preventing or reducing early leaving from (vocational) education and training, is performing well?

Cedefop has developed an evaluation plan for the monitoring and evaluation of specific policies and measures to be used by policy-makers and other stakeholders who are not experts in the field of evaluation. You can use the plan when developing your monitoring and evaluation approach. The steps below present a set of tasks that need to be completed. They do not define who does them – whether it is the evaluation steering committee, the evaluator or the client commissioning the evaluation. This responsibility will differ from case to case.

 

  • Title of the measure

Describe here the policy/measure to be evaluated.

  • Differences from other existing policies/measures

Compare here the evaluated policy/measure with similar policies/measures.

  • Evaluation period of time

Define the period of time that will be covered in the evaluation (from - to).

Example: from 2012 (when changes to the mentoring programme were introduced) to 2016.

  • Geographical scope of the evaluation

Define the regions covered by the evaluation.

Example: Geographical scope: region X

  • Objectives and how the policy/measure is expected to achieve them
    Review policy/measure documentation (e.g. guidelines, manuals), carry out interviews with those who designed the policy/measure.
  • Required changes for achieving objectives
    Design a visual presentation showing the logical chain from inputs and activities to outputs, results and impacts.
  • Unexpected (negative/positive) results
    • Interview people who are knowledgeable about the policy/measure's performance (e.g. those in charge of designing and implementing the measure; youth coaches or counsellors involved in its implementation can also be included).
    • Evaluate risks and state what could go wrong.

Example:

  • Results are worse than expected due to difficulties in implementation, e.g. insufficient resources at provider level or low commitment of institutional leaders or mentors.
  • Young people are offered only short-term employment contracts, and this leads to low motivation.

 

  • Key evaluation questions
    State here the questions your evaluation should answer.

Examples:

Have NEETs been reached and engaged in the measure as intended?

What effect does the measure have on participants’ motivation to learn / find a job?

What impact does the measure have on the NEET rate in our municipality?

  • Evaluation results
    State here what data you would like to have gathered at the end of the evaluation.

Key question: What will be considered good performance?

  • Definition of successful performance
    State here your criteria for successful measures.
Examples:

    Increase the number of youth centres which have outreach programmes to NEETs.

    Increase the number of youth centres which have outreach programmes to NEETs by 20%.

    Increase the share of youth centres which have outreach programmes to NEETs up to 50% of youth centres.

To read more about how to determine what constitutes good performance go to section ‘Deciding if our programme or policy is good enough’ >
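
The three target formulations above differ in how 'increase' is measured: an absolute change, a relative change, or a share of the total. A minimal sketch (all figures are hypothetical) of how each formulation would be checked against monitoring data:

```python
# Checking the three target formulations against hypothetical
# monitoring data; all figures below are invented for illustration.

def absolute_increase(before: int, after: int) -> int:
    """Change in the number of youth centres with outreach programmes."""
    return after - before

def percentage_increase(before: int, after: int) -> float:
    """Relative change, e.g. against a '20% increase' target."""
    return (after - before) / before * 100

def share_of_centres(with_outreach: int, total: int) -> float:
    """Share of all youth centres that have an outreach programme."""
    return with_outreach / total * 100

# Hypothetical baseline: 40 of 100 centres had outreach programmes;
# at follow-up, 50 of 100 do.
print(absolute_increase(40, 50))    # increase of 10 centres
print(percentage_increase(40, 50))  # 25.0 -> meets a 20% increase target
print(share_of_centres(50, 100))    # 50.0 -> meets the 50% share target
```

Note that the same underlying change can meet one formulation and miss another, which is why the criterion should be fixed before data collection starts.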

  • Recognising priorities

Unpack the intervention logic, unintended effects and judgement criteria into statements that can form the basis for defining indicators.

  • Indicators for making judgements

List your indicators here and make sure they cover inputs, outputs, results, impacts as well as context.

See indicator examples relevant to NEETs >

  • Data to measure indicators

Reflect on the nature of the indicator and what data you would ideally want to have. If this is not possible, think about alternative sources of data.

Examples:

  • Data on the ‘number of mentors mobilised’ is available from the measure’s reporting data.
  • Data on how the measure ‘increased participants’ self-confidence and motivation’ is not available but can be collected (e.g. exit interview or survey).
  • It is not possible to access administrative data on whether a young person who received a job offer through a programme is still employed a year later. Therefore, to measure the indicator ‘share of participants who find stable employment’, use young people's self-reports on their current activities and future career plans, gathered twelve months after the programme has ended.
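
The self-report approach in the last example can be sketched as a small tabulation. The field names and the operationalisation of 'stable employment' as holding a permanent contract are assumptions for illustration, not taken from any real dataset:

```python
# Hypothetical self-report survey records, collected twelve months
# after the programme ended; field names are illustrative.
responses = [
    {"id": 1, "current_activity": "employed", "contract": "permanent"},
    {"id": 2, "current_activity": "employed", "contract": "short-term"},
    {"id": 3, "current_activity": "studying", "contract": None},
    {"id": 4, "current_activity": "employed", "contract": "permanent"},
]

# 'Stable employment' is operationalised here as holding a permanent
# contract at survey time -- a judgement call the evaluation team
# would need to define and justify in advance.
stable = [r for r in responses if r["contract"] == "permanent"]
share_stable = len(stable) / len(responses) * 100
print(f"Share in stable employment: {share_stable:.0f}%")
```

Whatever operationalisation is chosen, it should be recorded alongside the indicator so later measurements remain comparable.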

 

  • Data collection tools
    State here all the necessary data collection tools (interview questionnaires, survey questionnaires, observation templates, etc.).
  • Sample selection strategy (for both qualitative and quantitative approaches)
  • Develop your methodology, reflecting on:
    • the capacity you have and the resources available;
    • the extent to which secondary data can be used;
    • what primary data needs to be collected.

Example:

A survey will be conducted over the phone six months after the programme ends. Former participants will be asked about their current activities (if studying, at which level and in which programme; if employed, under what kind of contract) and their future career plans (whether they plan to enrol in further education and training, find a job, or other).

  • Ways of identifying data patterns (for both qualitative and quantitative data)
    State the most suitable data analysis techniques given the evaluation questions.

 

  • Data presentation
    State here the techniques you are going to use for presenting your data.

Examples: Recent evaluation reports from France, Austria, Ireland, Sweden and Denmark:

  • Evaluation of the French Youth Guarantee (in French)
  • Evaluation of the Austrian Production Schools (in German)
  • Evaluation of the Irish initiative Youthreach (in English)
  • Evaluation of the UngKOMP measure in Sweden (in Swedish)
  • Evaluation of the Danish measure ‘Building Bridges to Education’ (in Danish)
  • Evolution of key indicators over time (before and after the intervention)
    Compare measurements taken before and after the policy/measure was put in place to identify trends over time.

Example:

The number of youth centres which have outreach plans to NEETs has increased by 30% after the requirement to have an outreach plan was introduced.

Newly adopted outreach plans are similar to the previously existing ones in that…

New outreach plans include some new features…

  • To what extent is the change identified a result of the policy/measure?
  • What would have happened anyway in absence of the policy/measure?
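
The before/after comparison can be sketched as a simple percentage-change calculation (figures hypothetical); note that the number alone does not answer the two attribution questions above, which require comparison with what would have happened anyway:

```python
# Hypothetical counts of youth centres with outreach plans,
# before and after the requirement to have one was introduced.
before = 20   # centres with an outreach plan before the requirement
after = 26    # centres with an outreach plan after the requirement

change_pct = (after - before) / before * 100
print(f"Change: {change_pct:+.0f}%")
```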
  • Cost levels compared to outputs and results achieved
    Calculate cost per output and, where possible, cost per result.
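
A minimal sketch of the unit-cost calculation, with invented figures; what counts as an output and what counts as a result depends on the intervention logic defined earlier:

```python
# All figures below are hypothetical, for illustration only.
total_cost = 500_000.0          # total programme cost (EUR)
participants_reached = 1_000    # an output: young people reached
participants_employed = 250     # a result: participants in employment

cost_per_output = total_cost / participants_reached
cost_per_result = total_cost / participants_employed

print(f"Cost per participant reached: EUR {cost_per_output:,.0f}")
print(f"Cost per participant employed: EUR {cost_per_result:,.0f}")
```

Unit costs computed this way are what makes the comparisons in the next two steps (with other measures, and across components) possible.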

 

  • Cost comparison with other, similar policies/measures
    Make comparisons with similar policies/measures.

 

  • Cost comparison across activities within the policy/measure
    Make comparisons across different components of the policy/measure.

Example:

The evaluation of the UK measure ‘The Youth Contract for 16-17 year olds not in education, employment or training’ included a cost-benefit analysis. It subtracted the estimated direct and indirect costs of the programme from the estimated long-term benefits of participating in it.

It examined the impact of additional qualifications gained through participation in the programme on increased lifetime earnings, improved health, and reduced criminal activity.
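
The cost-benefit logic described above, sketched with invented figures (the actual UK evaluation used detailed estimation methods; this only illustrates the subtraction at its core):

```python
# All figures below are hypothetical, for illustration only.
direct_costs = 2_000_000.0    # e.g. payments to providers
indirect_costs = 500_000.0    # e.g. administration and overheads
benefits = 3_200_000.0        # estimated long-term benefits (earnings,
                              # health, reduced criminal activity)

net_benefit = benefits - (direct_costs + indirect_costs)
print(f"Net benefit: EUR {net_benefit:,.0f}")
```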

Read 2014 evaluation report (in English) >