Do we need to search Web of Science citation indexes for systematic reviews?
Before you judge, I am not sure if the plural form is citation indices or citation indexes, so if you have studied English, Help! [Thanks to Liz on Twitter quoting from “Index, A History of the” that indices are for mathematicians & economists; indexes are what you find at the back of a book].
When did WoS start to be seriously considered?
Just like Scopus and Google Scholar, the WoS Core Collection databases are not specific to the health/medical sciences. As a result, they have not been considered primary search sources for healthcare systematic reviews.
The main use of WoS, Scopus, and Google Scholar was to check which papers cite the included studies in order to identify further relevant studies, so-called forward citation searching. Together with backward citation searching (checking the reference lists of included studies), these techniques are often called pearl growing or snowballing.
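To make the forward/backward distinction concrete, here is a minimal sketch of both directions using the free OpenAlex API (OpenAlex comes up again later in this post); WoS, Scopus, and Google Scholar support the same idea through their own interfaces. The work ID below is only an illustrative seed, not a study from any particular review.

```python
import requests

BASE = "https://api.openalex.org"
# Illustrative seed "pearl": one included study, identified by an OpenAlex work ID.
SEED = "W2741809807"  # any valid OpenAlex ID would do; this one is just an example

# Backward citation searching: the works that the seed study cites (its reference list).
seed = requests.get(f"{BASE}/works/{SEED}").json()
backward = seed.get("referenced_works", [])

# Forward citation searching: works whose reference lists contain the seed study.
citing = requests.get(
    f"{BASE}/works", params={"filter": f"cites:{SEED}", "per-page": 25}
).json()
forward = [w["id"] for w in citing.get("results", [])]

print(f"Backward (references): {len(backward)} works")
print(f"Forward (citing, first page only): {len(forward)} works")
```

In practice you would repeat this for every included study and screen the combined, deduplicated results, which is exactly what the pearl-growing/snowballing metaphor describes.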
The rumours started after Bramer et al. 2017 put the Web of Science (WoS) on the map of databases to be searched for systematic reviews. Before their study, searching WoS for a healthcare systematic review was as subjective as searching Scopus or Google Scholar. Furthermore, MECIR added a ‘highly desirable’ item for conference abstracts to the list of standards. BUT, did we correctly interpret Bramer et al.’s paper and MECIR’s item?
Don’t decide before finishing this post.
A. Critical Appraisal of Bramer et al. 2017 for practice
While I consider this research helpful in moving database selection from a subjective practice towards a more objective one, like any other study it should be critically appraised and interpreted in its context. Here are some considerations when interpreting the results:
- This research was based on 58 reviews.
- The reviews were selected if the first author’s affiliation was a specific organisation (i.e., Erasmus MC).
- Since the searches for these reviews were often librarian-mediated, the search methods and the list of databases used in the published reviews were likely shaped by the recommendations of the institutional librarians.
- Access to a source can bias the choice of databases for systematic reviews. Librarians understandably want their user community to be aware of paid resources so that the subscription fees translate into use and value. However, just because we have access to a source does not mean we must search it for systematic reviews; institutional librarians likely recommend searching WoS simply because they have access to it.
- This study does not report the list of databases searched in these 58 reviews [this could be remedied by issuing an erratum to the paper]. Searching Web of Science would likely retrieve new unique, relevant records, so if WoS was searched in these reviews, some unique, relevant records would inevitably come from this source; the same would be true, more or less, for Scopus, Google, and Google Scholar.
- Last but not least, the authors conclude that including WoS will likely add value to a review, and by value they likely meant identifying new unique, relevant records. They never claimed that adding new unique records would change the conclusions or generalizability of a review. Adding WoS, probably just like adding other general bibliographic databases such as Scopus, could increase the comprehensiveness of the included records (not necessarily studies). Whether such a practice is cost-saving or resource-intensive (in time, human, and monetary terms) and wasteful remains unclear.
- The Web of Science Core Collection may contain a variable number of databases: each institution could subscribe to anywhere from one to six citation indexes and other databases via Web of Science. The authors of this study did not report which databases made up their Web of Science Core Collection [this could be corrected by issuing an erratum to the paper], so it is unclear which combination of the following databases led to the paper's conclusion:
- Science Citation Index Expanded (SCIE)
- Social Sciences Citation Index (SSCI)
- Arts & Humanities Citation Index (AHCI)
- Emerging Sources Citation Index (ESCI)
- Conference Proceedings Citation Index — Science (CPCI-S)
- Conference Proceedings Citation Index — Social Sciences and Humanities (CPCI-SSH)
- Book Citation Index (BKCI) — Science
- Book Citation Index (BKCI) — Social Sciences and Humanities
- Current Chemical Reactions
- Index Chemicus
B. MECIR’s Highly Desirable item: WoS for Citation Tracking vs Searching Conference Proceedings
What I’m discussing here might be relevant to both MECIR and the Cochrane Handbook for Systematic Reviews of Interventions, which are linked and refer to each other.
- The search chapter in the Cochrane Handbook does not recommend searching WoS directly; however, it does discuss using citation indexes to track the citations of studies in order to find further studies, a practice that has been in place for years using WoS, the leading source, and later Scopus and Google Scholar (both launched in 2004).
- The handbook also refers to MECIR’s highly desirable item on searching conference proceedings/abstracts, so this might be relevant to WoS. Although MECIR mentions checking the references of included studies (backward citation searching), it does not explicitly refer to forward citation searching.
- Even if we consider that WoS has been subtly recommended, it is not a mandatory item; it could be interpreted as desirable in its own right or as highly desirable for searching conference abstracts.
- Looking at the technical supplement to the handbook’s search chapter, we see references to WoS in both the Citation Indexes and Conference Proceedings sections of the supplement. In the Citation Indexes section, WoS is mentioned alongside Scopus, Google Scholar, and OpenAlex (I wonder where that came from)! In the Conference Abstracts section, it is mentioned alongside Embase, CENTRAL, and Scopus. To make these sources comparable, the supplement falls into the industry’s commercial advertising trap and compares them by the number of records in millions. These ‘millions’ are for minions; they mean nothing in a systematic review context and do not guide us. We have no evidence that these numbers are correct, nor any clear way to interpret them:
- 80 million records for WoS, a figure that could include records from any non-medical field and that may count every reference in the bibliographies of indexed papers
- 10 million conference abstracts for Scopus, most of which come from engineering, computer science, and physics, since these fields, unlike medicine, tend to publish their conference proceedings as full papers rather than abstracts only
- 3.5 million conference abstracts for Embase (which seems smaller, but these are all medical and probably mostly published as supplement issues of journals)
Conclusion
It doesn’t matter whether the suggestion, recommendation, or advice comes from Cochrane, a published peer-reviewed study, or an individual; we need to critically appraise the evidence before using it.
If time and resources allow, Web of Science could be treated almost on a par with Scopus and Google Scholar for forward citation tracking of the included studies. This is not a mandatory or highly desirable item in the Cochrane guidelines.
If time and resources allow, the Web of Science Conference Proceedings Citation Indexes can also be searched alongside CENTRAL, Embase, and/or Scopus to comply with the highly desirable Cochrane MECIR item on searching conference proceedings. Bear in mind that this is not mandatory.
Revision: A colleague on Twitter also pointed out that sources such as WoS and Scopus could be among the search sources when you are conducting an interdisciplinary or multidisciplinary systematic review.
If you liked this blog post, please support me by pressing the green Follow button and signing up so I can write more. Thank you :D