
OP72 Software Tools For Systematic Literature Review In Medicine: A Review And Feature Analysis

Published online by Cambridge University Press: 23 December 2022


Abstract

Introduction

Systematic reviews (SRs) are central to evaluating therapies but carry high costs in time and money. Many software tools exist to assist with SRs, but most do not support the full process, and the transparency and replicability of an SR depend on performing and presenting the evidence according to established best practices. To provide a basis for comparing software tools that support SR, we performed a feature-by-feature comparison of SR tools.

Methods

We searched for SR tools by reviewing every tool listed in the Systematic Review Toolbox, tools identified in previous reviews of SR tools, and tools found through qualitative Google searching. We included all SR tools that were currently functional and required no coding, and excluded reference managers, desktop applications, and statistical software. The list of features to assess was populated by combining all features assessed in four previous reviews of SR tools; we also added five features (manual addition, screening automation, dual extraction, living review, and public outputs) that were independently noted as best practices or as enhancements of transparency/replicability. Two reviewers then assigned binary 'present/absent' assessments to all SR tools with respect to all features, and a third reviewer adjudicated all disagreements.
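As a minimal illustration of this agreement-and-adjudication step (a sketch only, not the authors' actual software or scripts), the Python snippet below assumes each reviewer's judgments are stored as a dictionary keyed by hypothetical (tool, feature) pairs; it reports simple percent agreement and lists the cells that would go to the third reviewer.

# Illustrative sketch, not the authors' workflow: two reviewers each record
# binary present/absent judgments for every (tool, feature) cell; disagreeing
# cells are flagged for adjudication and percent agreement is reported.

def percent_agreement(reviewer_a: dict, reviewer_b: dict) -> float:
    """Share of (tool, feature) cells on which both reviewers agree."""
    keys = reviewer_a.keys() & reviewer_b.keys()
    matches = sum(reviewer_a[k] == reviewer_b[k] for k in keys)
    return 100 * matches / len(keys)

def disagreements(reviewer_a: dict, reviewer_b: dict) -> list:
    """Cells needing adjudication by the third reviewer."""
    return [k for k in reviewer_a if reviewer_a[k] != reviewer_b[k]]

# Hypothetical example: two tools x two features, keyed by (tool, feature).
a = {("ToolX", "dual extraction"): 1, ("ToolX", "living review"): 0,
     ("ToolY", "dual extraction"): 1, ("ToolY", "living review"): 1}
b = {("ToolX", "dual extraction"): 1, ("ToolX", "living review"): 1,
     ("ToolY", "dual extraction"): 1, ("ToolY", "living review"): 1}

print(percent_agreement(a, b))  # 75.0
print(disagreements(a, b))      # [('ToolX', 'living review')]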

Results

Of 53 SR tools found, 29 were excluded, leaving 24 for assessment. Thirty features were assessed across six classes, and the inter-observer agreement was 86 percent. DistillerSR (Evidence Partners; n = 26/30, 87%), Nested Knowledge (Nested Knowledge; n = 25/30, 83%), and EPPI-Reviewer Web (EPPI-Centre; n = 24/30, 80%) support the most features, followed by Giotto Compliance (Giotto Compliance; n = 23/30, 77%), LitStream (ICF; n = 22/30, 73%), and SRDB.PRO (VTS Software; n = 21/30, 70%). Seven tools support fewer than half of all features assessed: RobotAnalyst, SyRF, Data Abstraction Assistant, SWIFT-Review, SR-Accelerator, RobotReviewer, and COVID-NMA. Notably, only 10 tools (42%) support direct search, 7 (29%) offer dual extraction, and 13 (54%) offer living/updatable reviews.
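For clarity, each reported percentage is simply that tool's feature count out of the 30 assessed features, rounded to the nearest whole percent. A small illustrative calculation using the counts quoted above:

# Illustrative only: feature counts out of 30 assessed features -> percent.
TOTAL_FEATURES = 30

counts = {"DistillerSR": 26, "Nested Knowledge": 25, "EPPI-Reviewer Web": 24,
          "Giotto Compliance": 23, "LitStream": 22, "SRDB.PRO": 21}

for tool, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{tool}: {n}/{TOTAL_FEATURES} ({round(100 * n / TOTAL_FEATURES)}%)")
# DistillerSR: 26/30 (87%) ... SRDB.PRO: 21/30 (70%)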

Conclusions

DistillerSR, EPPI-Reviewer Web, and Nested Knowledge each offer a high density of SR-focused, web-based tools. Through transparent comparison and discussion of SR tool functionality, the medical community can choose among existing software offerings and identify the areas where growth is needed, most notably in support for living reviews.

Type
Oral Presentations
Copyright
© The Author(s), 2022. Published by Cambridge University Press