Evidence synthesis communities and the future of evidence-informed conservation
Neal R Haddaway, Matthew J Grainger, Gavin Stewart
From computer science to public health, from sustainable agriculture to marine conservation, evidence synthesis is increasingly seen as a vital piece of the decision-making process. Evidence synthesis methodologies are developing rapidly, aided by new technologies that can make synthesis work more efficient. At the same time, communities of practice in synthesis methods are increasingly sharing and networking across sectors. Neal Haddaway shares some important developments that are helping to ensure that evidence-informed conservation is as rigorous, transparent and accessible as possible.
What is evidence synthesis?
Evidence synthesis refers to a suite of methods for identifying, describing, appraising, and synthesising a body of evidence on a topic. Evidence syntheses can take the form of evidence mapping (‘what research exists?’) or a full review (‘what works, when and why?’), and methods vary in their rigour and reliability (from scoping reviews to systematic reviews) and in the kind of data being synthesised (quantitative, qualitative or mixed). All methods share a common aim: helping the reader understand a large or disparate body of evidence.
How are evidence synthesis methods developing? How is technology affecting evidence synthesis?
Evidence synthesis began in the field of psychology in the ‘70s and ‘80s, but really took off in clinical medicine. In the early ‘90s the Cochrane Collaboration (now Cochrane) was founded to consolidate and support synthesis in healthcare, followed shortly by the Campbell Collaboration in social policy in the late ‘90s and the Collaboration for Environmental Evidence in the mid ‘00s. In the last few decades, evidence synthesis methods have developed rapidly and now include non-randomised experimental designs, qualitative evidence synthesis, rapid review methods, evidence mapping and Bayesian evidence synthesis, amongst many others.
In recent years, technology has begun to help reviewers deal with the increasing volume of primary literature being published. These tools include: review management tools designed specifically for evidence synthesis (e.g. www.sysrev.com); search strategy development support (e.g. litsearchr); evidence mapping visualisation tools (e.g. EviAtlas); and machine learning for article eligibility assessment (e.g. ASReview). At the same time, open collaborative groups have developed with the objective of highlighting, producing and testing emerging tools (e.g. the Evidence Synthesis Hackathon and the Systematic Review Toolbox). These technologies aim to improve the accessibility, efficiency, transparency and rigour of evidence synthesis.
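To make the last of these concrete, here is a minimal sketch, in Python with scikit-learn, of the active-learning idea behind machine-assisted screening tools such as ASReview: a classifier trained on the records screened so far ranks the unscreened records by predicted relevance, so the reviewer sees the most promising articles first. This is not ASReview's own API; the example records, labels and model choice are illustrative assumptions only.

```python
# A minimal sketch (not ASReview's actual API) of active-learning-style
# article screening: train on the records screened so far, then rank the
# unscreened records by predicted relevance. Records here are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical titles; 1 = relevant, 0 = irrelevant, None = not yet screened
records = [
    ("Effects of hedgerows on farmland bird abundance", 1),
    ("Quarterly earnings report for a retail chain", 0),
    ("Riparian buffer strips and aquatic invertebrate diversity", None),
    ("A survey of smartphone battery technologies", None),
]

texts = [title for title, _ in records]
X = TfidfVectorizer().fit_transform(texts)  # bag-of-words text features
labelled = [i for i, (_, y) in enumerate(records) if y is not None]
unlabelled = [i for i, (_, y) in enumerate(records) if y is None]

# Fit on screened records, then score the unscreened ones
model = LogisticRegression().fit(X[labelled], [records[i][1] for i in labelled])
scores = model.predict_proba(X[unlabelled])[:, 1]  # probability of relevance
for i, s in sorted(zip(unlabelled, scores), key=lambda pair: -pair[1]):
    print(f"{s:.2f}  {records[i][0]}")  # reviewer screens the top record next
```

In a real review this loop repeats: each newly screened record is added to the training set, the model is refit, and the ranking is updated, which is how such tools can surface most of the relevant literature after screening only a fraction of it.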
What are evidence synthesis communities of practice and why are they important?
Communities of practice (CoP) in evidence synthesis are groups of researchers, methodologists and advocates who work together to share, use and develop methods, tools and frameworks for rigorous evidence synthesis. These CoP exist both within disciplines (e.g. Cochrane in health) and across disciplines (e.g. the Global Evidence Synthesis Initiative for the Global South), and range in focus from the entire evidence-informed decision-making process (e.g. the Joanna Briggs Institute) to specific domains (e.g. the International Collaboration for the Automation of Systematic Reviews, which focuses on evidence synthesis technology).
These communities are vital for sharing experience and expertise, for collaborative methodology development, and for supporting one another in the shared goals and values of evidence-informed decision-making.
What does the future of evidence synthesis look like?
Evidence synthesis is increasingly appreciated as a valuable research endeavour and a vital cornerstone of evidence-informed decision-making, so we will see more and more evidence syntheses published in our field. Indeed, many people turned to evidence synthesis when fieldwork was cancelled during the pandemic.
As awareness increases, there is a continued need to build capacity for rigorous syntheses, because they are easy to get wrong: many people appreciate the value of the term ‘systematic review’, for example, but use it inappropriately, or develop novel methods and terms that omit some of the processes vital to ensuring syntheses are rigorous.
Technology will increasingly support evidence syntheses as reviewers have to deal with larger and larger evidence bases. But we must ensure that conservation decision-making draws on the best evidence: we must continue to develop and use the best available methods for producing, synthesising and using high-quality research. More than this, we must work as connected communities of practice: aware of the state of the art, using validated and robust methods, and sharing experiences and resources across disciplines and traditions. We can learn a great deal from one another, but we must be open to learning.
Reading list
1. Glass GV, Smith ML (1979) Meta-analysis of research on the relationship of class-size and achievement. Educational Evaluation and Policy Analysis 1: 2-16.
2. Thomson H, Craig P, Hilton-Boon M, et al. (2018) Applying the ROBINS-I tool to natural experiments: an example from public health. Systematic Reviews 7: 15. https://doi.org/10.1186/s13643-017-0659-4
3. Cooke A, Smith D, Booth A (2012) Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qualitative Health Research 22(10): 1435-1443.
4. Garritty C, Gartlehner G, Nussbaumer-Streit B, King VJ, Hamel C, Kamel C, Affengruber L, Stevens A (2020) Cochrane Rapid Reviews Methods Group offers evidence-informed guidance to conduct rapid reviews. Journal of Clinical Epidemiology.
5. James KL, Randall NP, Haddaway NR (2016) A methodology for systematic mapping in environmental sciences. Environmental Evidence 5(1): 1-13.
6. Sutton AJ, Abrams KR (2001) Bayesian methods in meta-analysis and evidence synthesis. Statistical Methods in Medical Research 10(4): 277-303.
7. van de Schoot R, de Bruin J, Schram R, et al. (2021) An open source machine learning framework for efficient and transparent systematic reviews. Nature Machine Intelligence 3: 125-133. https://doi.org/10.1038/s42256-020-00287-7
8. https://www.nature.com/articles/s41559-020-01295-x