ISSN:
1569-8041
Keywords:
oncology
;
review articles
;
systematic methodology
Source:
Springer Online Journal Archives 1860-2000
Topics:
Medicine
Notes:
Abstract
Purpose: Review articles are an important source of summary information for practising clinicians to assist them in remaining current with the rapidly expanding medical literature. Consequently it is essential that these be of the highest quality. In this study we evaluate, according to published criteria, the methodological quality of review articles (R) including meta-analyses (MA) appearing in a major cancer journal, Journal of Clinical Oncology (JCO), 1983–1995.
Methods: A hand-search of JCO was performed, from the first issue in January 1983 through December 1995, to identify R, defined as publications that describe and comment on studies relevant to a specific topic or clinical intervention. Only those dealing with aspects of treatment of human cancer were considered further. Methodological quality was first assessed using eight criteria proposed by Mulrow [1], rated independently by two medical oncologists as: specified, unclear or not specified. MA, including studies of dose intensity, were further analyzed according to 23 more detailed criteria proposed by Sacks et al. [2] and rated as adequate, partial or no/unknown compliance.
Results: Of 176 review articles, 122 dealt with aspects of treatment of cancer. Compliance with four of Mulrow's eight criteria was generally good, in that 99% clearly stated a purpose, all attempted qualitative synthesis of data, 95% presented a summary and 76% considered future directions. However, in the 106 qualitative reviews (QR), authors rarely gave information on methods of data identification (11.3%), data selection (10.4%) and assessment of validity (8.4%). Structured abstracts seemed to improve the focus and clarity of QR, and there was a minor improvement in deficient areas in the later time cohort (1990–1995). Based on 'adequate' compliance with each of the 23 criteria identified by Sacks et al. [2], six dose intensity studies scored 7–12, seven literature-data MA scored 10–15 and three individual-patient-data MA scored 16–18. The highest scores were in the sections relating to prospective design, combinability and statistical analysis. Factors relating to control of bias, sensitivity analysis and application of results were addressed less consistently.
Conclusions: With the exception of MA, the majority of authors contributing reviews to a major cancer journal, JCO, did not use systematic methods to identify, assess and synthesize information. Initiatives such as the Cochrane Collaboration Cancer Network can support and educate clinicians who wish to perform systematic reviews, but the quality of reviews would also improve if authors, editors and readers systematically applied any of the sets of criteria now available in the literature.
Type of Medium:
Electronic Resource
URL:
http://dx.doi.org/10.1023/A:1008269422459