Undertaking rapid evaluations during the COVID-19 pandemic: Lessons from evaluating COVID-19 remote home monitoring services in England

An examination of the process of conducting a large-scale rapid evaluation from design to dissemination and impact, with reflections on the key lessons for conducting future large-scale rapid evaluations.

Journal article

Published: 13/02/2023

Previous research suggests that rapid evaluations can offer evidence on innovations in health and social care that can be used to inform fast-moving policy and practice, and to support their scale-up. However, there are few comprehensive accounts of how to plan and conduct large-scale rapid evaluations, ensure scientific rigour, and achieve stakeholder engagement within compressed timeframes.

This paper examines the process of conducting a large-scale rapid evaluation from design to dissemination and impact, and reflects on the key lessons for conducting future large-scale rapid evaluations. 

Journal article information

Abstract

Introduction: Previous research suggests that rapid evaluations can offer evidence on innovations in health and social care that can be used to inform fast-moving policy and practice, and to support their scale-up. However, there are few comprehensive accounts of how to plan and conduct large-scale rapid evaluations, ensure scientific rigour, and achieve stakeholder engagement within compressed timeframes.

Methods: Using a case study of a national mixed-methods rapid evaluation of COVID-19 remote home monitoring services in England, conducted during the COVID-19 pandemic, this manuscript examines the process of conducting a large-scale rapid evaluation from design to dissemination and impact, and reflects on the key lessons for conducting future large-scale rapid evaluations. We describe each stage of the rapid evaluation: convening the team (study team and external collaborators), design and planning (scoping, designing protocols, study set-up), data collection and analysis, and dissemination.

Results: We reflect on why certain decisions were made and highlight facilitators and challenges. The manuscript concludes with 12 key lessons for conducting large-scale mixed-methods rapid evaluations of healthcare services. We propose that rapid study teams need to: (1) find ways of quickly building trust with external stakeholders, including evidence-users; (2) consider the needs of the rapid evaluation and the resources required; (3) use scoping to ensure the study is highly focused; (4) carefully consider what cannot be completed within the designated timeframe; (5) use structured processes to ensure consistency and rigour; (6) be flexible and responsive to changing needs and circumstances; (7) consider the risks associated with new approaches to quantitative data collection (and the usability of the data); (8) consider whether it is possible to use aggregated quantitative data, and what that would mean when presenting results; (9) consider using structured processes and layered analysis approaches to rapidly synthesise qualitative findings; (10) consider the balance between speed and the size and skills of the team; (11) ensure all team members know their roles and responsibilities and can communicate quickly and clearly; and (12) consider how best to share findings, in discussion with evidence-users, for rapid understanding and use.

Conclusion: These 12 lessons can be used to inform the development and conduct of future rapid evaluations in a range of contexts and settings.