Book contents
- Frontmatter
- Contents
- List of figures
- List of examples
- Acknowledgements
- Preface
- Glossary of selected evaluation terms
- 1 Introduction
- 2 Compilation: setting the right foundations
- 3 Composition: designing for needs
- 4 Conducting process evaluation
- 5 Conducting economic evaluation
- 6 Conducting impact evaluation
- 7 Analysis, reporting and communications
- 8 Emerging challenges for evaluation and evaluators
- References
- Annex A The ROTUR framework for managing evaluation expectations
- Annex B Ready reckoner guide to experimentation choices in impact evaluation
- Index
- Social Research Association Shorts
8 - Emerging challenges for evaluation and evaluators
Published online by Cambridge University Press: 05 April 2022
Summary
• Moving beyond ad hoc evaluation by integrating development and policy design
• Balancing rising user expectations with diminishing resources
• Shortening decision-making timeframes and possibilities for real-time analysis
• Building confidence and credibility through proactive engagement in evaluation
• Tackling relativism and proportionality imaginatively and constructively
• Next steps for evaluators, professional networking and guidance
Introduction
The previous chapters show how evaluators now have many options, and consequently choices to make, in compiling, composing and conducting an evaluation that can bring robust and credible evidence into decision-making. These options have expanded greatly, especially in the last two decades, and they will no doubt develop further in the next 10 to 20 years. It is beyond this author to predict how scholars and practitioners will add to these opportunities, but it is possible to take a forward look at some of the issues they will be responding to. The space and time available here can only scratch the surface of the challenging and often exciting possibilities that readers may face. What is clear is that evaluators who are able to think robustly, responsively and creatively will be in demand.
Integrating development and evaluation design
If it were possible to condense good evaluation design into a handful of rules, perhaps the first would be ‘timing is everything’. The UK's Magenta Book, the evaluation handbook for those in government (and outside), puts it this way:
the design and implementation of a policy affects how reliably it can be evaluated, and even quite minor adjustments to the way a policy is implemented can make the difference between being able to produce a reliable evaluation … and not being able to produce any meaningful evidence. (HM Treasury, 2011, p 25)
This is a sound caution, and it points to the need for evaluation to be designed alongside an intervention rather than as an afterthought. However, even within government, in the UK and elsewhere, other considerations often take precedence. Outside central government this principle is observed all too rarely. The result is that evaluators too often find themselves coming late to the party; the intervention is planned, and may have been running for months (or years), before the evaluators are called in.
- Demystifying Evaluation: Practical Approaches for Researchers and Users, pp. 157–172. Publisher: Bristol University Press. Print publication year: 2017.