AHRQ: May 16, 2013
Background. To remain useful, comparative effectiveness reviews (CERs) and other systematic reviews require periodic updating. Although several studies have assessed when and how to update, none has examined optimal formats for presenting update results to users. The aim of the present study was to gather input from various users of CERs on the usability of a range of formats for showing the changes from the original report to the update.
Methods. Using the executive summaries of a comparative effectiveness review our Evidence-based Practice Center conducted in 2001 and the update review we conducted in 2008, we initially created five versions of the update summary. Each version used a different format to show changes from the original to the update report (e.g., new and retired Key Questions; changes in search strategies and inclusion/exclusion criteria) and changes in the findings. To test the five formatted summaries, we identified several categories of CER users, convened an informal virtual focus group comprising those users, and asked them to evaluate the summaries on several dimensions, first via an email questionnaire and then in a group conference call where we presented the questionnaire results. Based on group feedback, we created two additional versions and tested them in a second focus group and among a third small group. The rationale for the selection of formats was two-fold: to imitate, and thus evaluate, the formats used by several organizations whose role is to conduct systematic reviews and updates, and to create and test novel formats in response to users' suggestions.