
Original research article
Trial registration and adherence to reporting guidelines in cardiovascular journals
  1. Matt Thomas Sims,
  2. Aaron Marc Bowers,
  3. Jamie Morgan Fernan,
  4. Kody Duane Dormire,
  5. James Murphy Herrington,
  6. Matt Vassar
  1. College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
  1. Correspondence to Aaron Marc Bowers, Oklahoma State University Center for Health Sciences, Tulsa, OK 74074, USA; Aaron.Bowers@okstate.edu

Abstract

Objective This study investigated the policies of cardiac and cardiovascular system journals concerning clinical trial registration and guideline adoption to understand how frequently journals use these mechanisms to improve transparency, trial reporting and overall study quality.

Methods We selected the top 20 (by impact factor) journals cited in the subcategory ‘Cardiac and Cardiovascular Systems’ of the Expanded Science Citation Index of the 2014 Journal Citation Reports to extract journal policies concerning the 17 guidelines we identified. In addition, trial and systematic review registration adherence statements were extracted. 300 randomised controlled trials published in 2016 in the top 20 journals were searched for clinical trial registry numbers and CONSORT diagrams.

Results Of the 19 cardiac and cardiovascular system journals included in our analysis, eight (42%) did not require or recommend trial or review registration. Seven (37%) did not recommend or require a single guideline within their instructions to authors. The Consolidated Standards of Reporting Trials (CONSORT) guidelines (10/19, 53%) were recommended or required most often. Of the trials surveyed, 122/285 (42.8%) published a CONSORT diagram in their manuscript, while 236/292 (80.8%) published a trial registry number.

Discussion Cardiac and cardiovascular system journals infrequently require, recommend or enforce the use of reporting guidelines. Furthermore, too few require or enforce the use of clinical trial registration. Cardiology journal editors should consider adopting guidelines, given their potential to limit bias and increase transparency.

  • statistics and study design
  • medical ethics
  • research approaches

Introduction

‘The consequence of poorly reported findings is the potential to cause real harm. Readers of the scientific literature deserve to know that editors, reviewers, and authors have adopted processes that foster clarity and replication.’1

An estimated US$240 billion is spent annually on health research,2 resulting in 3 million research articles each year, published in thousands of scientific journals.3 Reporting guidelines are checklists or flow diagrams designed to ensure research is reported in a uniform and practical way. The guidelines help readers compare studies to one another in an ‘apples to apples’ fashion that was rarely possible before.4 Reporting guidelines are also useful in decreasing risk of bias in research by ensuring that results are fully published and the methods are reproducible.5 Despite the success and utility of reporting guidelines, medical journals are reluctant to recommend or require them.6–9 For example, Meerpohl et al found that less than 10% of paediatrics journals endorsed the use of reporting guidelines.9

Another mechanism for minimising the risk of bias is clinical trial registration. The Food and Drug Administration (FDA) Amendments Act (FDAAA) of 2007 requires that all clinical trials performed in the USA be registered with ClinicalTrials.gov before the first individual is enrolled.10 Additionally, the WHO has released a statement in support of trial registration.11 Despite this strong stance, Mathieu et al found that more than 54% of clinical trials in cardiology, rheumatology and gastroenterology were inadequately registered.12 According to journal editors, proper trial registration is the most valuable tool to ensure unbiased reporting of results.13

The field of cardiology is not immune to deficiencies in clinical trial registration and reporting guideline use. Kelly et al found that the checklist items of Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) were inadequately reported 51% of the time in 66 cardiovascular systematic reviews.14 A study by Chang et al found that 51% of clinical trials using novel high-risk cardiovascular devices remained unreported more than 2 years after FDA approval.15 The aim of this study was to examine the reporting guideline and trial registration policies of cardiac and cardiovascular system journals. We sought to determine to what degree these mechanisms were being used and whether the policies are effective.

Methods

Our study reviewed journal policies concerning trial registration requirements and guideline adherence. It does not fit the definitions in 45 CFR 46.102(d) and (f) of the Department of Health and Human Services’ Code of Federal Regulations and is therefore not subject to institutional review board oversight. The Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines for reporting descriptive statistics were applied.16 Our study registration can be found on the University Hospital Medical Information Network Clinical Trial Registry (UMIN-CTR, UMIN000024082).

Methods for our study were adapted from similar studies done in other fields.7 8 We selected the top 20 (by impact factor) journals cited in the subcategory ‘Cardiac and Cardiovascular Systems’ of the Expanded Science Citation Index of the 2014 Journal Citation Reports (Thomson Reuters: New York, NY), accessed on 11 July 2016. Web-based searches for each journal were performed (22 August 2016) by the first author (MTS) to locate the submission guidelines for authors. The editor in chief of each journal was emailed by MTS to obtain a record of the study designs reviewed for publication (diagnostic accuracy studies, animal research, clinical trials, case reports, observational studies in epidemiology, economic evaluations, qualitative research studies, systematic reviews/meta-analyses, quality improvement studies and study protocols). For non-responding editors, follow-up emails were sent once a week for 3 weeks to increase the response rate.

MTS catalogued the title and impact factor of each journal as of the date the top 20 journals were identified, 22 August 2016. Next, coauthors (AMB, JMF, KDD) reviewed the instructions and policies related to manuscript submission (hereafter referred to as ‘instructions for authors’). Each journal’s adherence statements for the reporting guidelines detailed in table 1 were extracted, along with International Committee of Medical Journal Editors (ICMJE) membership and trial and review registration adherence statements.

Table 1

Reporting guidelines by study type

AMB, JMF and KDD individually examined each statement concerning the use of reporting guidelines and trial registration. Statements were rated according to the strength of the endorsement, with categories limited to required, recommended or failed to mention. The rating of ‘recommended’ included the following: ‘should’, ‘prefer’, ‘encourage’ or ‘in accordance to the recommendation of’. The rating of ‘required’ included the following: ‘must’, ‘need’ or ‘manuscripts won’t be considered for publication unless’.8 After the rating process was completed, MTS compared ratings and resolved discrepancies. STATA V.13 (StataCorp, College Station, TX) was used for the statistical analysis. Study types not published by a journal were excluded when calculating percentages; for instance, if 4 of 20 journals did not publish case reports, the CARE guideline percentages would be calculated out of 16 journals.
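To make the rating scheme concrete, here is a minimal sketch in Python, assuming free-text policy statements as input; the function name and structure are illustrative rather than the authors’ actual tooling.

```python
# A minimal sketch of the keyword-based rating scheme described above.
# The keyword lists come from the Methods; everything else is illustrative.

REQUIRED_TERMS = ("must", "need", "won't be considered for publication unless")
RECOMMENDED_TERMS = ("should", "prefer", "encourage",
                     "in accordance to the recommendation of")

def rate_statement(statement: str) -> str:
    """Rate an instructions-for-authors statement as 'required',
    'recommended' or 'failed to mention'."""
    text = statement.lower()
    # Check the stronger category first so 'must' outranks 'should'
    if any(term in text for term in REQUIRED_TERMS):
        return "required"
    if any(term in text for term in RECOMMENDED_TERMS):
        return "recommended"
    return "failed to mention"

print(rate_statement("Authors must register trials before enrolment."))  # required
print(rate_statement("We encourage use of the CONSORT checklist."))      # recommended
print(rate_statement("Submit figures as TIFF files."))                   # failed to mention
```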

AMB next performed a search of PubMed by publication type ‘randomized controlled trial’ for the same 20 journals on 8 August 2017. This method has been shown to have a sensitivity and specificity over 93% and relatively high precision for correctly returning randomised trials.17 The date range was 1 January 2016 to 31 December 2016. The resulting trials were divided based on whether journals required trial registration and whether journals recommended or required CONSORT in their instructions for authors. AMB and MTS searched these publications for trial registration information and a published CONSORT diagram.
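Such a search can be reproduced through NCBI’s public E-utilities; the sketch below assumes the standard esearch endpoint and a query term reconstructed from the Methods (the exact string the authors used is not given, and the journal name is a placeholder).

```python
# A hedged sketch of the PubMed search described above, using NCBI's
# E-utilities esearch endpoint with publication type 'randomized
# controlled trial' and the 2016 date range from the Methods.
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def find_2016_rcts(journal: str) -> list[str]:
    """Return PubMed IDs of randomised controlled trials published in
    `journal` between 1 January and 31 December 2016."""
    term = (f'"{journal}"[Journal] '
            'AND randomized controlled trial[Publication Type] '
            'AND ("2016/01/01"[Date - Publication] : "2016/12/31"[Date - Publication])')
    resp = requests.get(ESEARCH, params={"db": "pubmed", "term": term,
                                         "retmax": 500, "retmode": "json"})
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

pmids = find_2016_rcts("International Journal of Cardiology")
print(len(pmids), "trials found")
```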

Results

Our sample comprised the top 20 cardiac and cardiovascular system journals by impact factor (range 4.638–17.759; mean 8.156±4.085). For each study type, the appropriate guideline was identified (table 1). Editor in chief email inquiries resulted in a response rate of 45.0% (9/20). We excluded Nature Reviews Cardiology from our analysis because it does not accept original research. For each guideline, journals that did not accept the corresponding study type were removed from the statistical analysis: STARD (1/19, 5%), CARE (2/19, 11%), TRIPOD (1/19, 5%), CHEERS (1/19, 5%), COREQ (2/19, 11%), SRQR (2/19, 11%), SQUIRE (1/19, 5%), PRISMA-P (4/19, 21%) and SPIRIT (4/19, 21%) (table 2).

Table 2

Reporting guidelines and trial registries endorsed by cardiology journals

Reporting guidelines

The EQUATOR Network was referenced in the instructions for authors of two (11%) journals. Journals that recommended the EQUATOR Network were assessed as recommending all guidelines included in the network. The instructions for authors of 18 (95%) journals referenced the ICMJE uniform requirements for manuscripts. Of the 19 cardiac and cardiovascular system journals, 7 (37%) did not contain an adherence statement for any of the reporting guidelines; the remaining 12 (63%) recommended or required at least one reporting guideline.

Table 2 displays reporting guideline utilisation. In our journal sample, the CONSORT statement (10/19, 53%) was mentioned most frequently, being required by 1 journal (5%) and recommended by 9 (47%). The STARD guidelines (7/18, 39%) were the second most frequently mentioned, followed by the MOOSE guidelines (7/19, 37%). The QUOROM statement was not mentioned by any journal (figure 1).

Figure 1

Frequency of reporting guideline mentions across journals.

Clinical trial and systematic review registration

Of the 19 cardiac and cardiovascular system journals, 8 (42%) did not contain adherence statements for trial or review registration; the remaining 11 (58%) mentioned one or both. Ten (53%) journals required and one (5%) recommended trial registration through any trial registry. Registration on ClinicalTrials.gov was cited by eight (42%) journals: required by one and recommended by seven. Registration on the WHO’s platform was recommended by seven (37%) journals. One (5%) journal recommended systematic review registration on the PROSPERO platform (figure 2).

Figure 2

Frequency of registration recommendation and requirement across journals.

Adherence to trial registration and CONSORT from a sample of randomised trials

Our PubMed search returned 300 results (figure 3). Of these, six were not randomised controlled trials. Nine trials were not available in full text, although seven of them reported a trial registry number in the abstract. Searches of The Journal of Cardiovascular Magnetic Resonance, The Journal of Molecular and Cellular Cardiology and Basic Research in Cardiology failed to return any randomised controlled trials. Of the remaining 16 journals, 10 recommended or required trial registration; these 10 journals reported a trial registry number 83.6% (148/177) of the time, while the six journals that did not mention trial registration did so 76.5% (88/115) of the time. This difference was not statistically significant (P=0.12); however, visual inspection of our data revealed a possible outlier that required further investigation. We therefore calculated the registration percentage of each journal that provided guidance on trial registration. Among these journals, eight of the nine with five or more published trials in our sample had trial registration rates over 80% (figure 4). The outlier was The International Journal of Cardiology, which published a trial registry number only 61% (25/41) of the time. With this outlier removed, the nine journals requiring or recommending trial registration reported a registry number 90.4% (123/136) of the time, significantly more often than the journals that did not mention trial registration (P=0.002). Of the four journals that published five or more trials and did not mention trial registration, only JACC: Cardiovascular Interventions included a trial registry number in the publication more than 80% of the time.
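The article does not state which statistical test produced these P values; as a check, a two-sided two-proportion z-test (one plausible choice) approximately reproduces them:

```python
# A minimal sketch of the two-proportion comparisons above. The test used
# by the authors is not named; a two-sided two-proportion z-test is assumed
# here and gives P values close to those reported.
from statsmodels.stats.proportion import proportions_ztest

# Registry numbers: 148/177 (endorsing journals) vs 88/115 (no mention)
z, p = proportions_ztest(count=[148, 88], nobs=[177, 115])
print(f"all journals: z={z:.2f}, P={p:.3f}")     # P ≈ 0.13 (reported: 0.12)

# Same comparison with the outlier's 25/41 removed from the endorsing group
z, p = proportions_ztest(count=[148 - 25, 88], nobs=[177 - 41, 115])
print(f"outlier removed: z={z:.2f}, P={p:.3f}")  # P ≈ 0.003 (reported: 0.002)
```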

Figure 4

Frequency of published trial registry number.

We found nine journals that recommended or required use of the CONSORT guideline. Among these nine journals, 168 trials were surveyed and 68 (40.5%) published a CONSORT diagram. Among the seven journals that did not mention the CONSORT guideline, 54/117 (46.2%) of trials published a CONSORT diagram in the manuscript. This difference was not statistically significant (P=0.3). Of the eight journals that published five or more randomised controlled trials and required or recommended use of the CONSORT guideline, only Circulation: Cardiovascular Imaging published a CONSORT diagram more than 80% of the time (figure 5). Of the four journals that published five or more trials and did not mention the CONSORT guideline, none published a CONSORT diagram more than 80% of the time.
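The same hedged z-test sketch applies to this comparison:

```python
# The z-test sketch from the previous subsection, applied to the CONSORT
# diagram comparison: 68/168 (endorsing journals) vs 54/117 (no mention).
from statsmodels.stats.proportion import proportions_ztest

z, p = proportions_ztest(count=[68, 54], nobs=[168, 117])
print(f"CONSORT diagrams: z={z:.2f}, P={p:.3f}")  # P ≈ 0.34 (reported: 0.3)
```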

Figure 5

Frequency of published CONSORT diagram.

Discussion

The aim of our study was to survey the reporting guidelines used in the top cardiovascular journals. Seven of the 19 journals did not include adherence statements for any of the reporting guidelines found in table 1. These statements and checklists were created to ensure accurate reporting of research, provide a template for replication and ensure that good research does not go unpublished because of weak writing.18

By far, the most referenced guideline was the ICMJE uniform requirements for manuscripts, which was mentioned by 18/19 journals. The ICMJE was created to improve the quality and transparency of medical research,19 and it provides guidance for conflicts of interest, reporting of results and manuscript editing. The ICMJE mandates that member journals do not accept manuscripts for publication unless the clinical trials were registered in a public registry before the first participant was enrolled.20 The ICMJE has been very effective at increasing clinical trial registration. ICMJE member journals have a 96% trial registration rate, compared with just 39% for all other journals.21 22 Across the ICMJE journals surveyed, we found that trial registry numbers were published only 80.2% of the time, indicating a deficiency in adherence.

Eight (42%) of the journals surveyed made no mention of clinical trial registration in any form. Clinical trial registration is designed to hold researchers accountable for the findings of their study, whether or not they are favourable. Failure to properly register a clinical trial leaves the door open for inaccurate or ‘cherry-picked’ evidence to find its way into clinical decision-making. Our study found four journals that published five or more trials in 2016 and made no mention of trial registration; three of the four had trial registry numbers in fewer than 80% of their manuscripts. Nine journals from our sample recommended or required trial registration and published five or more trials in 2016; eight of the nine published trial registry numbers more than 80% of the time, while The International Journal of Cardiology did so only 61% of the time. Journal editors have the power to improve research transparency, and requiring trial registration and enforcing that policy is an important step in that direction.

Two of the 19 journals that accept meta-analyses and systematic reviews recommend the use of PRISMA, and none require the use of QUOROM. The QUOROM statement was developed in 1999 to improve the quality of systematic reviews and meta-analyses.23 In 2009, the PRISMA statement was developed to expand and improve the QUOROM checklist by making the research more transparent.24 Given that the QUOROM statement is now obsolete, no journal is expected to recommend it. The PRISMA statement is recognised as the gold standard for meta-analyses and systematic reviews, and it is recommended at a much higher rate by journals in other medical specialties.6–8

Eight guidelines (SPIRIT, PRISMA-P, SQUIRE, CARE, CHEERS, COREQ, STROBE, TRIPOD) were each recommended by only two journals. These lesser-used guidelines are available, along with many others, through the EQUATOR Network. EQUATOR was developed to promote uniformity and transparency in health research reporting by increasing access to guidelines.25 To date, the network has catalogued 360 reporting guidelines and provides a simple algorithm that gives authors and editors an easy way to find which reporting guidelines to use for their study.26 Only 2 of the 19 journals endorsed the use of EQUATOR in their instructions for authors.

CONSORT is one of the most studied, used and cited reporting guidelines.27 It was endorsed by 53% (10/19) of the cardiology journals we surveyed, similar to the rate among journals of other specialties.6–8 Moher et al found that reporting of randomised controlled trials (RCTs) improved more in journals that adopted CONSORT than with any other guideline.27 Despite its widespread use and benefit, adherence to CONSORT in cardiology RCTs is lacking. Zheng et al found a mean CONSORT score of 64% among RCTs studying pharmacological therapies for heart failure with preserved ejection fraction.28 Adherence in our sample was similarly poor, with only 42.8% of trials publishing a CONSORT diagram. Rates did not significantly differ between journals that endorsed the CONSORT statement (40.5%, 68/168) and those that made no mention of it (46.2%, 54/117).

Inadequate reporting practices create a barrier for readers trying to compare and understand pertinent results. Hirst and Altman suggest three possible reasons why reporting guidelines are not being used: (1) a lack of awareness of what the guidelines are, (2) uncertainty about their usefulness and (3) confusion about how to use them.29 The EQUATOR Network is specifically designed to address all three of these concerns. Fuller et al found that authors who believed the use of reporting guidelines would increase the chance of publication in a high-impact journal were more likely to adhere to them.30 We recommend that journal editors update their instructions for authors to include reporting guidelines, the EQUATOR Network and clinical trial registration. Additionally, this section of a journal’s website should be easy to locate, be clear about the types of studies the journal accepts and set the expectation that articles follow reporting guidelines and clinical trial protocols. We recommend that authors use the EQUATOR Network to determine which guideline to follow and strive for the highest standards of reporting. Doing so will ensure that journal readers can make an informed evaluation of the research that will shape future decision-making in the field of cardiology.

Limitations

We sought each editor in chief’s clarification of the instructions for authors to ensure the accuracy of our interpretations. Many were unresponsive to email inquiries; therefore, we did not receive verification of their requirements or of the types of studies their journals accept. We were also unable to obtain full-text access to nine randomised controlled trials via our library. However, we believe this missing information is unlikely to significantly change the results of our study.

Key messages

What is already known on this subject?

Reporting guidelines improve the quality and transparency of research articles. Trial registration minimises the risk of bias infiltrating the decision-making process in clinical practice. Endorsement of clinical trial registration and use of reporting guidelines varies among medical specialties.

What might this study add?

Our study surveys the top journals in the field of cardiology, identifying which have the most rigorous standards for reporting research. We identify concerns with the ‘instructions for authors’ section of journal websites.

How might this impact on clinical practice?

Clinical research is the backbone of evidence-based medicine. Requiring the use of reporting guidelines and trial registration will improve research reporting.

References

Footnotes

  • Contributors MTS developed the protocol and contributed to data extraction, writing the manuscript, making tables, editing and revision. AMB contributed to data extraction, making figures, writing, revising and submitting the manuscript. KDD and JMF extracted data. JMH and MV oversaw the project and contributed to writing and edits.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.