According to the World Health Organization (WHO, 2010), interprofessional education (IPE) occurs “when two or more professions learn about, from and with each other to enable effective collaboration and improve health outcomes” (p. 13). ASHA-accredited programs are actively engaged in addressing the first part of this definition by incorporating IPE experiences into their programs (ASHA, 2019). However, these programs devote much less attention to the last part of the IPE definition. To date, IPE outcome reports focus on isolated program evaluations and learner outcomes and not on the impact of these efforts on collaborative practice (CP) or patients/clients and populations. We need to implement changes in how we approach interprofessional education and collaborative practice (IPECP) research. Doing so will help us understand how IPE leads to effective collaborations in all work settings—and how these collaborations positively impact our patients/clients and educational and health care systems.

Reports from the Institute of Medicine (IOM, 2015) and InterprofessionalResearch.Global and Interprofessional.Global (Khalili et al., 2019) paint a dismal picture of our ability to connect our educational efforts with meaningful, sustainable patient and system outcomes. These reports describe our research shortcomings and provide recommendations to guide future efforts. In response to many of these concerns, Musaji and colleagues (2019) propose applying a translational model to IPECP research. This model includes identifying specific gaps along a continuum when (a) translating IPE to CP and (b) examining the results of CP. The process includes steps in identifying which IPE programs to use, improving these programs, and examining effects of these interventions in clinical settings and practice.

The following considerations are based on recommendations cited in the two previously mentioned reports (IOM, 2015; Khalili et al., 2019). These recommendations may be useful in implementing the translational models described by Musaji and colleagues (2019). For each consideration, basic precepts are summarized in italics and then described.

Understand the Value and Limitations of Current Evidence

The use of effective models to evaluate IPE research may lead to a greater understanding of context (what works with whom, how, and why) and ultimately may lead to improved educational and collaborative practice. Context is particularly important as we consider the diversity of teams that include communication sciences and disorders (CSD) professionals across educational and health care employment settings. Currently, we turn to systematic reviews and meta-analyses of limited numbers of studies that meet review criteria to understand the impact of IPECP. For example, Reeves and colleagues (2017) found that IPE supports collaborative knowledge, skills, and attitudes but found limited evidence that it enhances CP or patient care. Many researchers suggest that, to better understand the “context–mechanism–outcome of interventions” described in these reports, we adopt a “realist”/realistic evaluation approach. In making their case for using such an approach, Hewitt and colleagues (2012) discuss the limitations of traditional systematic reviews and meta-analyses in informing educational practices and policies (e.g., meta-analyses often reduce programs to a single measure of effect and neglect to consider how outcomes vary for different participants). With a realist approach, rather than asking, “Does this IPE program work?” we ask, “What is it about this kind of intervention that works; for whom; and in what circumstances, in what respects, and why?” (Hewitt et al., 2012, p. 251). To support the use of a realist approach, IPE research models can consider examining and/or controlling for potential factors that may impact outcomes (e.g., profession, learner level, prior experiences).

As we work to implement evidence-based science in IPE, we should not discount the value of team science in informing our educational and clinical practices. The robust body of evidence and theory describing effective teams and how to develop them spans decades. This information supports current efforts in developing evidence-based team training (see, e.g., Salas et al., 2018, and Tannenbaum & Salas, 2021).

Use Theoretical Frameworks to Drive Curriculum Development and Assessment

Without theoretical frameworks to support the development, implementation, and assessment of pre- and post-certification IPE, we will perpetuate an ineffective, economically draining “patchwork approach” to IPE. The IOM and Global IPE research reports stress the importance of adopting IPE theoretical frameworks that address a learning continuum to support pedagogically sound and effective curricula with desired proximal and distal outcomes. We have made much progress in IPE toward understanding the value of theory in connecting concepts, organizing complex phenomena, explaining outcomes, and generating practical implications (Reeves & Hean, 2013). Theories that relate to the learning process (e.g., adult learning, psychodynamic theory, contact theory, identity theory) and the learning context (e.g., sociology of the professions, general systems theory, organizational theory) enhance our ability to create, deliver, and explain IPE experiences and outcomes (Barr, 2013).

Successful implementation of translational models depends on using comprehensive and appropriate outcome assessments based on our theoretical frameworks. Our assessments must go beyond examining reactions and learning—they must include evaluating behaviors and results. Program examiners are increasingly using versions of Kirkpatrick and Kirkpatrick’s (2016) New World Kirkpatrick Model to evaluate IPE programs (Nunez et al., 2020), suggesting an increase in the examination of IPE’s impact. Many tools for examining team performance are available and have been critically reviewed (see, e.g., Marlow et al., 2019). In its recommendations (as do others), the IOM stresses the value of qualitative data and encourages mixed designs along with quantitative studies. Further, programs such as the Nexus Innovation Incubator network and the National Center Data Repository are collaborating with educational and health care systems to gather distal data that will enable researchers to study and advance IPECP (Nawal Lutfiyya et al., 2015).

Collaborate and Pool Resources

The challenges of translational research do not absolve academic and clinical programs from their responsibility to develop and implement IPECP impact research programs. To address potential obstacles, we need to pool resources and collaborate across institutions (Nunez et al., 2020).

  • Establish educational/externship partnerships: Bi-directional exchanges between academic programs and educational/health care preceptors should be formalized to support educational continuity, educational relevance, patient/client/family involvement, and impact assessment. As discussed by many, the conceptual space of IPE and CP cannot be separated.
  • Institute cross-institutional collaborations: When institutions partner, they can identify and leverage one another’s expertise, more easily observe trends, and enhance and sustain educational and research efforts. Many examples of successful institutional IPE collaborations exist across the United States. To ensure a diversity of professions, to tap expertise, and to share resources, Texas Christian University has collaborated with area institutions to develop, conduct, and assess interprofessional experiences. The Texas IPE Consortium “foster[s] cross-institutional collaboration in order to expand learning opportunities and reinforce value for IPE as a critical aspect of health professions education” and has more than 28 institutional participants (Texas IPE Consortium, n.d.). Now, more than ever, we need to collectively address the call for impact research. This is not simple work and requires teams of experts to support, for example, the many proposed methodologies for IPECP research (e.g., comparative effectiveness research, economic analysis). Although credentialing standards across professions have expanded to include effective collaboration, resources typically have not.
  • Connect with “big data” efforts: Staying connected with organizations and programs that promote collaboration to generate shareable information, such as the National Center for Interprofessional Practice and Education’s IPE Knowledge Generation program (White Delaney et al., 2020), will allow us to use big data science in understanding outcomes.
  • Increase professional connections with policymakers: Increased involvement in accreditation, certification/licensure, education, and health care committees and boards will enhance an understanding of systems when examining IPECP outcomes.


American Speech-Language-Hearing Association. (2019). All together now: IPE/IPP approaches in academic and clinical education. The ASHA Leader, 24(10), 34.

Barr, H. (2013). Toward a theoretical framework for interprofessional education. Journal of Interprofessional Care, 27(1), 4–9.

Hewitt, G., Sims, S., & Harris, R. (2012). The realist approach to evaluation research: An introduction. International Journal of Therapy and Rehabilitation, 19(5), 250–259.

Institute of Medicine. (2015). Measuring the impact of interprofessional education on collaborative practice and patient outcomes. Washington, DC: The National Academies Press.

Khalili, H., Thistlethwaite, J., El-Awaisi, A., Pfeifle, A., Gilbert, J., Lising, D., MacMillan, K., Maxwell, B., Grymonpre, R., Rodrigues, F., Snyman, S., & Xyrichis, A. (2019). Guidance on global interprofessional education and collaborative practice research: Discussion paper.

Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation. Alexandria, VA: ATD Press.

Marlow, S. L., Lacerenza, C., Iwig, C., & Salas, E. (2019). Assessing health care team performance: A review of tools and the evidence supporting their use.

Musaji, I., Self, T., Marble-Flint, K., & Kanade, A. (2019). Moving from interprofessional education toward interprofessional practice: Bridging the translation gap. Perspectives of the ASHA Special Interest Groups, 4(5), 971–976.

Nawal Lutfiyya, M., Brandt, B., Delaney, C., Pechacek, J., & Cerra, F. (2015). Setting a research agenda for interprofessional education and collaborative practice in the context of United States health system reform. Journal of Interprofessional Care, 30(1), 7–14.

Nunez, L., Mauldin, M., Pfeifle, A., Bridges, D., Jensen, G., & Nickol, D. (2020). IP Collaborative Scholarship Part I: Fundamentals and a Framework for Collaborations. Seminar presented at Nexus Summit, August 2020.

Reeves, S., & Hean, S. (2013). Why we need theory to help us better understand the nature of interprofessional education, practice, and care. Journal of Interprofessional Care, 27(1), 1–3.

Reeves, S., Palaganas, J., & Zierler, B. (2017, Spring). An updated synthesis of review evidence of interprofessional education. Journal of Allied Health, 46(1), 56–61.

Salas, E., Zajac, S., & Marlow, S. L. (2018). Transforming health care one team at a time: Ten observations and the trail ahead. Group & Organization Management, 43(3), 357–381.

Tannenbaum, S., & Salas, E. (2021). Teams that work: The seven drivers of team effectiveness. Oxford University Press.

Texas IPE Consortium. (n.d.). Welcome to the Texas Interprofessional Education (IPE) Consortium.

White Delaney, C., AbuSalah, A., Yeazel, M., Stumpf Kertz, J., Pejsa, L., & Brandt, B. F. (2020). National Center for Interprofessional Practice and Education IPE core data set and information exchange for knowledge generation. Journal of Interprofessional Care. Advance online publication.

World Health Organization. (2010). Framework for action on interprofessional education and collaborative practice.