For over 15 years, we have supported practitioners across different sectors in building their evaluation capacity, helping them meet the growing demand to demonstrate the effectiveness and value of their work.
We run introductory evaluation workshops for policy makers and analysts across a range of government departments, as well as methodology courses (particularly in theory of change). We have also developed a series of workshops on ‘dynamics of evaluation’, addressing the people skills needed to deliver evaluation activities in challenging environments, and we co-deliver modules of the CECAN training on complexity in evaluation.
The British Council, like many other organisations working internationally as well as locally, recognises that it too needs to evidence the difference it makes. The British Council is also aware that many staff across the regions in which it works may have an interest in, or a requirement to engage with, understanding and evidencing impact, but that many are likely to need evaluation ‘literacy’ training.
In response, the British Council has developed a three-year training and capacity building strategy, and we are working with Ipsos MORI to deliver the first two phases of this work through nine intensive training courses across four geographical regions, together with mentoring for a small number of Council staff. The design of the training reflected the need for it to be firmly located within local contexts, including social, political, thematic, programme and project considerations.
For example, while the British Council’s core training was relevant across all regions, the evaluation training was tailored to address specific regional issues that affect evidence gathering (e.g. where projects/programmes are dispersed over wide geographic areas, where IT equipment and/or skills are limited, or where gender/power relations affect access to evidence of impact).
Contextually relevant content also engages participants and helps cement learning. A pilot course was delivered in London, UK, with attendees from wider Europe, the Americas and beyond. Building on programme design and theory of change development, participants welcomed the fact that we were not training them as evaluators. Instead, we aimed to equip them, as programme and project managers, with the knowledge and skills to recognise, commission, manage and use high quality evaluations.
Ipsos MORI and the Tavistock Institute of Human Relations (TIHR) have extensive experience of developing and delivering evaluation and research methods training for a variety of audiences.
Testing, adapting, integrating: three core principles underpinning our evaluation training
We base our approach on three core principles, also embedded in our mentoring:
- A ‘test and learn’ principle: the pilot programme is being rolled out across four regions, with learnings continuously shared and applied to refine training for subsequent use.
- An agile and responsive design and approach to supporting trainees: our training is based on a consistent structure and materials but tailored to meet local context and programme themes.
- Integrating the pilot training with wider British Council evaluation requirements: we do not see this training in isolation from other evaluation activities. Our approach contributes to and learns from other components.
Our course participants have generally welcomed the fact that, alongside input on current policy, theory and practice in the field of evaluation, the days also provide a number of opportunities to put these ideas into practice, working on a ‘case study’ related to their own field of experience. This includes developing a ‘theory of change’, and exploring how this can help in identifying suitable evaluation questions and data collection methods for the particular policy, programme or project under evaluation.
What participants think about our evaluation training
‘An excellent course, covering the broad universe of evaluation and the strengths and weaknesses of different evaluation approaches’.
‘Extremely digestible and very practical course. Inviting participants to draw on their experiences was particularly illuminating’.
‘Well run session and highly relevant for my work. The overview of the different evaluation models, the theory of change mapping and working through the case study were particularly helpful, as they made the principles clear and gave us practice’.
Dr Kerstin Junge, Giorgia Iacopini, Anna Sophie Hahne, Heather Stradling