By now, you’ve likely heard about efforts to decrease the research-to-practice gap in communication sciences and disorders (CSD) through implementation science. Tools and techniques of implementation science aim to improve client outcomes by systematically attending to factors that influence the translation of research into real-life settings (e.g., the individual provider, the organizational context, the assessment or treatment itself). Implementation science theories, models, and frameworks allow us to (a) better guide the process of getting research into practice, (b) explain what might contribute to implementation outcomes of interest, and (c) evaluate our implementation efforts. Incorporating implementation science into our research methods may also contribute to health equity by prioritizing the needs of the patient and by partnering with students, clients, families, clinicians, organizational leaders, and other relevant parties.
Applying Implementation Science to Educational Practices
There is recent interest in considering the role that implementation science might play in our educational practices. Specifically, how do we take scholarship of teaching and learning (SoTL) research and apply it more broadly across courses, programs, and institutions? A recent workshop from the National Academies of Sciences, Engineering, and Medicine explored the intersection of health professions education and implementation science. If we consider evidence-based pedagogy—such as problem-based learning, service-learning, study-abroad experiences, and interprofessional education—how might implementation science aid in understanding what is needed to use these approaches more broadly to improve our student and future clinician outcomes?
An Example
The language of implementation science and the SoTL alone can be complicated, but Geoffrey Curran invites us to make implementation science "too simple." So, we offer this example to get us started and to help us think through one potential way to bridge the gap between SoTL research and classroom implementation. If we implement "the thing," or the evidence-based pedagogical approach of problem-based learning, effectiveness research would tell us whether "the thing" works. If problem-based learning works, then, after participating in a problem-based learning environment, our students would demonstrate improvements in critical thinking, problem-solving abilities, and/or communication skills—at least within the context of the specific students, course content, and setting in which the research took place.

However, effectiveness research by itself does not help us spread "the thing" consistently within and across real-world classrooms. This is where implementation science comes in. Implementation research examines the best ways to help instructors, programs, and institutions "do the thing," taking into account both implementation strategies and implementation outcomes. If we continue with our problem-based learning example, our research questions might be the following: How might we best help instructors use problem-based learning in their classrooms, programs, or institutions? What do we need to do to help them incorporate this evidence-based pedagogical approach so that it has a positive impact on student learning outcomes? This may mean helping educators think about what they are doing before, during, and after the problem-based learning class period or course, or encouraging them to consider which aspects of their teaching or educational setting might be interfering with the use of this pedagogical approach. Therefore, implementation science prompts us to ask questions such as the following:
- What are the key ingredients of problem-based learning?
- What do educators need to do, or what can they leave behind or modify?
- How much or how little of problem-based learning is required to meet the desired student learning outcomes?
Implementation strategies are the "stuff we do" to help people "do the thing." We could consider strategies such as making adaptations to the local context, changing the infrastructure, providing technical assistance, and identifying early adopters—all of which might help us implement problem-based learning.
Implementation outcomes are related to "how much and how well people do the thing." What percentage of our faculty are consistently implementing problem-based learning with fidelity? Or, similarly, do we have evidence of problem-based learning being implemented across our curriculum—and, if so, to what degree? Are certain courses or instructors more amenable to using problem-based learning? Why or why not?
Key Implementation Outcomes
A particular focus on implementation outcomes such as fidelity, feasibility, and sustainability can help us bridge the research-to-practice gap in SoTL. Applying this to our example (i.e., the implementation of problem-based learning), we would need to think about whether problem-based learning can be faithfully and consistently implemented. If so, how easy or difficult is it to incorporate problem-based learning into the teaching environment? Is this something that is doable for a typical faculty member, program, and/or institution? If it is doable, can it be maintained or continued over time? What resources or supports would need to be in place to maintain the use of problem-based learning? These questions get at the variables of fidelity, feasibility, and sustainability—and attending to these implementation outcomes at the beginning of, or even before, the implementation of a practice like problem-based learning can be of great value.
A Call to Action
Implementation science is already spreading within the clinical fields that fall under the CSD discipline. It is now time to adapt implementation science research and to systematically examine the implementation of SoTL in our college classrooms. This is a call to use what SoTL research tells us about effective approaches to improving student learning outcomes—and to examine what it takes to "do the thing" and implement these evidence-based pedagogical approaches across faculty members, courses, programs, and institutions. We need to determine what resources are necessary by examining the appropriateness, cost, feasibility, fidelity, and sustainability of implementing evidence-based pedagogical approaches more broadly. However, it takes all of us—the SoTL researchers and the higher education faculty, programs, and institutions—to "do the thing." We have the opportunity to examine the "stuff we do" and "how much and how well people do the thing." And, most important, we need to share and disseminate our results with one another to improve teaching and learning outcomes as we train the next generation of clinicians within our discipline.