
Data for improvement and clinical excellence: report of an interrupted time series trial of feedback in long-term care

Overview of attention for article published in Implementation Science, November 2014

  • Above-average Attention Score compared to outputs of the same age (62nd percentile)

Mentioned by

  • Twitter: 6 tweeters

Citations

  • Dimensions: 5 citations

Readers on

  • Mendeley: 32 readers
Title
Data for improvement and clinical excellence: report of an interrupted time series trial of feedback in long-term care
Published in
Implementation Science, November 2014
DOI 10.1186/s13012-014-0161-5
Authors

Anne E Sales, Corinne Schalm, Melba Andrea B Baylon, Kimberly D Fraser

Abstract

Background: There is considerable evidence for the effectiveness of audit coupled with feedback in changing provider behavior, although few feedback interventions have been conducted in long-term care (LTC) settings. The primary purpose of the Data for Improvement and Clinical Excellence-Long-Term Care (DICE-LTC) project was to assess the effects of a feedback intervention, delivered to all direct care providers, on resident outcomes. Our objective in this report is to assess the effect of feedback reporting on rates of pain assessment, depression screening, and falls over time.

Methods: The intervention consisted of monthly feedback reports delivered over 13 months to all direct care providers, facility and unit administrators, and support staff in nine LTC units across four facilities. Data for the feedback reports came from the Resident Assessment Instrument Minimum Data Set (RAI) version 2.0, a standardized instrument mandated in LTC facilities throughout Alberta. The primary evaluation used an interrupted time series design with a comparison group (units not included in the feedback intervention) and a comparison condition (pressure ulcers). We used segmented regression analysis to assess the effect of the feedback intervention.

Results: The primary outcome of the study, falls, showed little change over the period of the intervention, apart from a small increase in the rate of falls during the intervention period. The only outcome that improved during the intervention period was the proportion of residents with high pain scores, which decreased at the beginning of the intervention. The proportion of residents with high depression scores appeared to worsen during the intervention.

Conclusions: Maintaining all nine units in the study for its 13-month duration was a positive outcome. The feedback reports, without any other intervention, did not achieve the desired reduction in the proportion of falls and elevated depression scores. The survey on intention to change pain assessment practice, conducted shortly after most of the feedback distribution cycles, may have acted as a co-intervention supporting the reduction in pain scores. Because the underlying data are mandated, feedback reports could be processed and delivered at relatively low cost and could be added to other intervention approaches to support implementation of evidence-based practices.
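The segmented regression analysis named in the Methods can be sketched as an ordinary least-squares fit with terms for the baseline trend, a level change at the start of the intervention, and a slope change afterward. The sketch below uses simulated monthly data, not the study's data; the variable names, the 12-month baseline, and the magnitude of the level drop are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 12, 13                 # assumed 12 baseline months, 13 intervention months
t = np.arange(n_pre + n_post, dtype=float)        # month index
post = (t >= n_pre).astype(float)                 # 1 during the intervention period
t_since = np.where(post == 1, t - n_pre + 1, 0.0) # months since intervention start

# Simulated outcome: baseline level 20, slight upward trend,
# and a level drop of 3 when the intervention begins.
rate = 20 + 0.1 * t - 3.0 * post + rng.normal(0, 0.5, t.size)

# Design matrix: intercept, pre-intervention trend, level change, slope change
X = np.column_stack([np.ones_like(t), t, post, t_since])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)

print(np.round(beta, 2))  # [intercept, trend, level change, slope change]
```

The coefficient on `post` estimates the immediate level change at the intervention start, and the coefficient on `t_since` estimates the change in slope, which is how an interrupted time series separates an abrupt effect from a gradual one.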

Twitter Demographics

The data shown below were collected from the profiles of 6 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 32 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
United States   1       3%
Unknown         31      97%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                       7       22%
Student > Ph.D. Student          5       16%
Professor                        4       13%
Student > Master                 4       13%
Unspecified                      3       9%
Other                            9       28%

Readers by discipline            Count   As %
Medicine and Dentistry           12      38%
Psychology                       7       22%
Unspecified                      5       16%
Nursing and Health Professions   4       13%
Social Sciences                  3       9%
Other                            1       3%

Attention Score in Context

This research output has an Altmetric Attention Score of 3. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 21 December 2014.
All research outputs: #5,982,417 of 11,344,222 outputs
Outputs from Implementation Science: #916 of 1,201 outputs
Outputs of similar age: #75,675 of 208,527 outputs
Outputs of similar age from Implementation Science: #41 of 57 outputs
Altmetric has tracked 11,344,222 research outputs across all sources so far. This one is in the 46th percentile – i.e., 46% of other outputs scored the same or lower than it.
So far Altmetric has tracked 1,201 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 11.9. This one is in the 22nd percentile – i.e., 22% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 208,527 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 62% of its contemporaries.
We're also able to compare this research output to 57 others from the same source and published within six weeks on either side of this one. This one is in the 26th percentile – i.e., 26% of its contemporaries scored the same or lower than it.