The conduct of biomedical research is highly resource intensive, but it is an effort deemed worthy of the expense because the end results—new knowledge, scientific evidence, and benefits to public health—can be trusted. Even when outright advances do not arise from an investigation, the product of research should be trustworthy enough, at the very least, to propel us forward. The results should help us marshal additional resources and commitment toward achieving our collective aims.

Challenges to Research Reporting

The contexts in which most research projects are conceptualized and carried out have their own sets of pressures, conflicts, and competing aims. Researchers’ careers and chances of future success can be greatly affected by the outcomes of their projects, which naturally influences the manner in which research is reported. These effects occur in both predictable and unpredictable ways and can involve structural flaws. The list of potential structural flaws is long, encompassing cherry-picking of results, failure to report null results, and, yes, even outright research misconduct.

An array of solutions exists to counteract inherent structural flaws in the research endeavor. With the level of investment involved and with the continued expansion of the research and publishing enterprises, there are many reasons to increase scrutiny. For an account of how research reporting has fared over the past few decades, we encourage you to read Drummond Rennie’s “Let’s Make Peer Review Scientific” (2016), a terrific overview and call to action.

The intent of this article is to encourage you to train your investigatory eye on how we can aim for excellence in the areas we can influence. For example, how can we reduce our bias as reviewers? How can we push for new forms of reporting that better support the transparency and openness of our research methodologies and practices? How can we report our research in a more standardized way? What can we do to report our findings (or lack thereof) such that their trustworthiness is only enhanced?

Rennie’s article ends with a note of hope that the 2017 Peer Review Congress, an international symposium on research into these matters, would be an important stepping stone on the path to improved reporting approaches across the board. We attended that meeting and came away encouraged by the depth of progress being made (for a deep dive into the research presented there, take a look at the plenary and poster abstracts).

Available Guidelines

In the Communication Sciences and Disorders (CSD) discipline, a basic first step is to report our research—when applicable—in a more standardized, structured way. For particular types of research, there are now well-developed guidelines in place that help ensure that studies are reported clearly and that reports contain enough of the right kind of information to promote reproducibility. The EQUATOR Network has become a widely relied-upon repository for such guidance. For the nearly 400 different reporting guidelines housed there, you can find the research assessing their validity, as well as the checklists that many journals now encourage or require authors to complete and provide during manuscript submission.

For the ASHA journals, we now strongly encourage authors to make use of the EQUATOR resources for particular types of research, though provision of the checklists is not required at submission (see the Guidelines for Reporting Your Research on the ASHA Journals Academy for more information). A resource currently in development for the Academy is a set of training modules for peer reviewers so that they can best advise authors on refining the reporting of their research according to the applicable guideline specifications.

Another important step for our discipline’s improved rigor and reproducibility is to follow the best practices put forth by organizations such as the Center for Open Science. The Transparency and Openness Promotion (TOP) guidelines are increasingly recognized as the de facto standard for the publication of research. These best practice guidelines span design, analysis, and data citation/transparency. Even in the absence of a journal’s endorsement of such guidelines or a requirement to provide evidence of compliance, frameworks such as the TOP guidelines are worthy of researchers’ consideration as they conduct and report research.

Future Directions

As Rogers and Cannon (2017) have detailed, the further we travel along this path, the more soundly we build our knowledge base. There is much to be done in how we conduct and report research, and there are more types of research that we should be adding to the knowledge base. Areas to address include the following:

  • “Dark data,” or materials that have value but that, because of the pressures around publication, never become discoverable;
  • How to publish data so that it is accessible to and usable by future researchers;
  • Null and negative results, for example, via registered reports.

The more precisely and clearly we report study designs and results, the faster the future generation, publication, and application of knowledge can proceed.

Moving Forward

As academics, researchers, editors, and publishers, we are on the front lines where advances in reporting guidelines can most effectively be supported and promoted. Likewise, we are the producers of much of the research itself. Although the study of how we might best report research and perform reviews has been going on for decades, we are at an inflection point where technology and demand are meeting head-on. This is an exciting time, one that will require CSD researchers to make the entire process, not just peer review, more systematic and scientific, something that we are well trained and well situated to achieve.

References

Rennie, D. (2016, July 7). Let’s make peer review scientific. Nature, 535, 31–33. https://doi.org/10.1038/535031a

Rogers, M. A., & Cannon, M. (2017). Influencing the culture of scholarly and professional communities to advance clinical research and accelerate knowledge translation. In M. Rice (Ed.), University research planning in the data era: Working with the levers and pulleys that tie together research information from big data to local details (pp. 68–82). Lawrence: The University of Kansas Merrill Advanced Studies Center. Available from http://merrill.ku.edu/mission-retreat-papers