Do you want to evaluate whether a measure you oversee, aimed at preventing or reducing early leaving from (vocational) education and training, is performing well?

Cedefop has developed an evaluation plan for monitoring and evaluating specific policies and measures, designed for learning providers who are not experts in evaluation. You can use it when developing your own monitoring and evaluation approach. The steps below present the tasks to complete.

  • Evaluated activities

Provide here a list of the activities you have introduced that should be evaluated.

Example: activities aimed at boosting young unemployed people’s motivation.

 

  • Period of time covered by the evaluation

Provide here dates (from – to).

Example: from 2014 (when these activities were introduced) to 2016.

  • Reasons for introducing the above activities and expected results

State, for each activity, why it was introduced and what it was expected to achieve.

  • Other unexpected effects

State here any other positive or negative effects that were not initially anticipated.

Example:

The selection of a young person to participate in a support programme that also includes counselling can be perceived negatively by the young person if attending counselling is seen as a weakness in the community. This may reduce their motivation to participate in any component of the programme. It may be necessary to cover this aspect in the evaluation of the programme's retention rates.

  • Key evaluation questions

State here the questions your evaluation should answer.

Examples:

Have young people been engaged in the motivational activities as intended?

What effect do the activities undertaken have on participants' motivation to look for a job?

What impact do the activities undertaken have on the NEET (not in education, employment or training) rate in the municipality?

  • Definition of successful performance

State here the specific results you would like to achieve.

Examples (formulated with increasing precision):

Increase in the number of young people NEET registered with public employment services (PES).

Increase in the share of young people NEET registered with PES by 10%.

Increase in the share of young people NEET registered with PES up to 90%.

To read more about how to determine what constitutes good performance, go to the section ‘Decide if programme or policy is good enough’ >
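Whether such a target counts as met depends on how it is phrased: ‘by 10%’ is a relative increase, while ‘up to 90%’ is an absolute level. A minimal sketch of the difference, using invented figures (all numbers are hypothetical):

```python
# All figures are invented for illustration.
neet_population = 1000        # young people NEET in the municipality
registered_before = 750       # registered with PES before the measure
registered_after = 840        # registered with PES after the measure

share_before = registered_before / neet_population   # 75%
share_after = registered_after / neet_population     # 84%

# Reading 1: "increase the share by 10%" (relative increase)
relative_target_met = share_after >= share_before * 1.10
# Reading 2: "increase the share up to 90%" (absolute level)
absolute_target_met = share_after >= 0.90

print(f"Share before: {share_before:.0%}, after: {share_after:.0%}")
print(f"Relative target met: {relative_target_met}")   # True  (84% >= 82.5%)
print(f"Absolute target met: {absolute_target_met}")   # False (84% < 90%)
```

The same data can pass one formulation of success and fail the other, which is why the definition should be fixed before data collection starts.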

  • Recognising priorities

Unpack the intervention logic, unintended effects and judgment criteria into statements that can form the basis for defining indicators.

  • Indicators for making judgments

List your indicators here and make sure they cover inputs, outputs, results and impacts, as well as context (see the sketch below).

 

See indicator examples relevant to NEETs >
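As a minimal sketch of how such a list could be organised, the snippet below groups hypothetical indicators (invented for this example, echoing the NEET scenario used above) by category:

```python
# Hypothetical indicators for a motivational measure targeting young people NEET.
indicators = {
    "inputs":  ["budget spent on motivational activities",
                "number of practitioners mobilised"],
    "outputs": ["number of young people participating in the activities"],
    "results": ["share of participants reporting increased motivation to look for a job"],
    "impacts": ["change in the NEET rate in the municipality"],
    "context": ["local youth unemployment rate"],
}

for category, items in indicators.items():
    print(f"{category}:")
    for item in items:
        print(f"  - {item}")
```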

 

  • Data for populating each indicator

Reflect on the data you would ideally like to have for each indicator. If it is not possible to get these data, think about alternative data.

Examples:

Data on the number of practitioners mobilised is available from the measure’s reporting data.

Data on how the measure ‘increased young people’s self-confidence and motivation’ is not available but can be collected (e.g. through an exit interview or survey).

It may not be possible to access administrative data on whether a young person who received a job offer through the programme is still employed a year later. In that case, to measure the indicator ‘share of participants finding stable employment’, use young people's self-reports on their current activities and future career plans, gathered twelve months after the programme has ended.

  • Data collection tools

State here all the necessary data collection tools (interview questionnaires, survey questionnaires, observation templates, etc.).

 

  • Sample selection strategy (for both qualitative and quantitative approaches)

What sample is needed to collect the data for each method (e.g. how many young people will be interviewed)?

Develop your methodology here.

Example:

A short survey will be conducted over the phone six months after the programme ends. Former participants will be asked about their current activities (if studying, at what level and in which programme; if employed, with what kind of contract) and their future career plans (whether they plan to enrol in further education and training, find a job, or other).
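If not all former participants can be reached, a standard sample-size formula for estimating a proportion can help decide how many to survey. A sketch assuming a 95% confidence level and a ±10% margin of error (both assumptions to adjust to your own context):

```python
import math

def sample_size(population: int, margin: float = 0.10,
                z: float = 1.96, p: float = 0.5) -> int:
    """Cochran's formula for a proportion, with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical cohort of 120 former participants.
print(sample_size(120))  # -> 54 phone interviews needed
```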

  • Data analysis and presentation

You are most likely to summarise quantitative data using totals, percentages and averages, and to identify illustrative examples through the analysis of the qualitative data.
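A minimal sketch of such a quantitative summary, using invented follow-up survey responses (the data and categories are hypothetical):

```python
# Invented responses: current activity of former participants, six months on.
responses = ["employed", "studying", "employed", "unemployed",
             "studying", "employed", "other", "employed"]
ages = [19, 22, 20, 24, 18, 21, 23, 20]   # invented

total = len(responses)
print(f"Total respondents: {total}")
for activity in sorted(set(responses)):
    count = responses.count(activity)
    print(f"{activity}: {count} ({count / total:.0%})")
print(f"Average age: {sum(ages) / len(ages):.1f}")
```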

Examples of evaluation reports (note that these evaluations were not developed at provider level, so your evaluation is unlikely to be as complex):

  • Evaluation of the French Youth Guarantee (in French)
  • Evaluation of the Austrian Production Schools (in German)
  • Evaluation of the Irish initiative Youthreach (in English)
  • Evaluation of the UngKOMP measure in Sweden (in Swedish)
  • Evaluation of the Danish measure ‘Building Bridges to Education’ (in Danish)
  • Evolution of key indicators over time (before and after the evaluated measure was put in place)

Compare the indicators from before the activities were implemented with those from after.

Qualify the change (describe it qualitatively) and quantify it (express it in numbers).

Example:

The number of previously inactive young people enrolled in upskilling programmes has increased by 15%.

Enrolment has increased in upskilling programmes, in particular in the following programmes…

The self-reported motivations of young inactive people to enrol in an upskilling programme were…

They consider that participating in the activities contributed to their decision to enrol in an upskilling programme…
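A sketch of how the quantified part of such a comparison can be computed, with invented enrolment counts chosen to match the 15% figure above:

```python
# Invented enrolment counts of previously inactive young people.
enrolled_before = 200   # the year before the activities started
enrolled_after = 230    # the year after

change = enrolled_after - enrolled_before
relative_change = change / enrolled_before
print(f"Enrolment change: {change:+d} ({relative_change:+.0%})")
# -> Enrolment change: +30 (+15%)
```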

 

 

  • Cost levels compared to outputs and results achieved

Assess here the financial and human resources mobilised to put the activities in place, and compare them with the outputs and results achieved.

Example:

Compare the total costs of the programme with the number of young people reached and with the share of those who experienced positive results.
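A minimal sketch of this comparison, with invented cost and outcome figures:

```python
# All figures are invented for illustration.
total_cost = 150_000.0       # total programme cost, EUR
reached = 300                # young people reached
positive_results = 180       # e.g. re-engaged in education or employment

print(f"Cost per young person reached: EUR {total_cost / reached:,.0f}")           # EUR 500
print(f"Cost per positive result:      EUR {total_cost / positive_results:,.0f}")  # EUR 833
```

Tracked over time or compared with similar measures, these unit costs give a first indication of cost-effectiveness.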