
Rapid Review Summit: an overview and initiation of a research agenda

Overview of attention for article published in Systematic Reviews, September 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (89th percentile)
  • High Attention Score compared to outputs of the same age and source (92nd percentile)

Mentioned by

  • 1 policy source
  • 20 tweeters

Citations

  • 25 Dimensions

Readers on

  • 52 Mendeley
Published in
Systematic Reviews, September 2015
DOI 10.1186/s13643-015-0111-6
Julie Polisena, Chantelle Garritty, Craig A. Umscheid, Chris Kamel, Kevin Samra, Jeannette Smith, Ann Vosilla


The demand for accelerated forms of evidence synthesis is on the rise, largely in response to requests by health care decision makers for expeditious assessment and up-to-date information about health care technologies, health services, and programs. As a field, rapid review evidence synthesis is marked by a tension between the strategic priority to inform health care decision-making and the scientific imperative to produce robust, high-quality research that soundly supports health policy and practice.

In early 2015, the Canadian Agency for Drugs and Technologies in Health convened a forum in partnership with the British Columbia Ministry of Health, the British Columbia Centre for Clinical Epidemiology and Evaluation, the Ottawa Hospital Research Institute, and the University of Pennsylvania. More than 150 evidence synthesis producers and end users attended the Rapid Review Summit: Then, Now and in the Future. The Summit program focused on the evolving role and practices of rapid reviews to support informed health care policy and clinical decision-making, including the uptake and use of health technology assessment.

Our discussion paper highlights the important discussions that occurred during the Rapid Review Summit. It focuses on the initial development of a research agenda that resulted from the Summit presentations and discussions. The research topics centered on three key areas of interest: (1) how to conduct a rapid review; (2) investigating the validity and utility of rapid reviews; and (3) how to improve access to rapid reviews.

Twitter Demographics

The data shown below were collected from the profiles of 20 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 52 Mendeley readers of this research output.

Geographical breakdown

Country          Count  As %
United Kingdom       2    4%
United States        1    2%
Unknown             49   94%

Demographic breakdown

Readers by professional status       Count  As %
Researcher                              11   21%
Other                                    6   12%
Student > Ph.D. Student                  5   10%
Student > Master                         4    8%
Librarian                                4    8%
Other                                    9   17%
Unknown                                 13   25%

Readers by discipline                Count  As %
Medicine and Dentistry                  12   23%
Nursing and Health Professions           7   13%
Social Sciences                          4    8%
Economics, Econometrics and Finance      3    6%
Computer Science                         3    6%
Other                                    9   17%
Unknown                                 14   27%

Attention Score in Context

This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 08 May 2019.
  • All research outputs (of 20,927,597 outputs)
  • Outputs from Systematic Reviews (of 1,819 outputs)
  • Outputs of similar age (of 263,406 outputs)
  • Outputs of similar age from Systematic Reviews (of 13 outputs)
Altmetric has tracked 20,927,597 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far, Altmetric has tracked 1,819 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 12.6. This one has done well, scoring higher than 82% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 263,406 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 89% of its contemporaries.
We're also able to compare this research output to 13 others from the same source published within six weeks on either side of this one. This one has done particularly well, scoring higher than 92% of its contemporaries.