Categories
education physiotherapy

Stop complaining about the “knowledge-practice gap”

The “knowledge-practice gap” is a well-known problem in health professions education, and an enormous amount of time is spent complaining about how difficult it is to narrow. The truth is that the knowledge-practice gap is a problem of our own making, and the name we’ve given this problem hints at the answer.

We’ve set it up so that there is a tension between what happens in the classroom (acquire knowledge) and what is supposed to happen in practice (use knowledge). Or, to be more specific, there is a tension between how students think and behave in the classroom and how we want them to think and behave in the clinical context. This is the “gap” that we’re always talking about bridging; the difference between the knowledge that students acquire in the classroom, and the practical application of that knowledge in clinical practice.

However, instead of treating the problem as something natural to be overcome (“this is just the way it is”), we can accept that the gap exists simply because most of what we expect students to do in the classroom is not a practice at all. We set up a situation where knowledge is acquired in one context and applied in another, and then complain when students struggle to move between the two.

The truth is that we already have good evidence to suggest alternative ways of thinking about the “different contexts” problem, and we know what to do about it. Situated cognition is a learning theory that proposes that:

“…knowledge is situated, being in part a product of the activity, context, and culture in which it is developed and used.”

In other words, knowledge must be acquired in contexts similar to the ones in which it must be used. If you think about the classroom context, what ways of thinking and being are students required to practice? Are they required to practice at all? In order to satisfy most physiotherapy educators, our students simply need to show up, sit down and listen. Even if we assume that they are able to construct knowledge in some meaningful way from this traditional approach to learning (generally speaking, they are not), how does this practice enable them to apply what they learn in the classroom to the clinical context? Simply put, it doesn’t. The reality is that the knowledge-practice gap exists because of the way we teach.

In order to address the problem of the knowledge-practice gap we need to accept that students’ ways of thinking and being in the classroom must be similar to the ways of thinking and being we expect in the clinical context. We must therefore give students learning tasks in the classroom that require them to think and behave in the same ways we expect while they are on clinical rotation; classroom practice and clinical practice must resemble each other. Seen from this perspective, there would be no knowledge-practice gap, because there would be no difference between the context in which knowledge is acquired and the context in which it is used.

So, how do we create a classroom context where students are expected to think and behave in ways that are similar to how we expect them to think and behave in the clinical context? I think that authentic learning is a good place to start. It’s a teaching framework that operationalises situated cognition. In other words, it’s a way of thinking about learning task design that builds in attributes which require students to think and behave in one context in ways that transfer to other contexts. I’ve written some notes on authentic learning before, so I won’t go into detail here, other than to share the characteristics of authentic learning, which are that tasks:

  • Should have real-world relevance i.e. they match real-world tasks
  • Are ill-defined (students must define tasks and sub-tasks in order to complete the activity) i.e. there are multiple interpretations of both the problem and the solution
  • Are complex and must be explored over a sustained period of time i.e. days, weeks and months, rather than minutes or hours
  • Provide opportunities to examine the task from different perspectives, using a variety of resources i.e. there isn’t a single answer that is the “best” one. Multiple resources require that students differentiate between relevant and irrelevant information
  • Provide inherent opportunities to collaborate i.e. collaboration is integral to the task
  • Provide opportunities to reflect i.e. students must be able to make choices and reflect on those choices
  • Must be integrated and applied across different subject areas and lead beyond domain-specific outcomes i.e. they encourage interdisciplinary perspectives and enable diverse roles and expertise
  • Are seamlessly integrated with assessment i.e. the assessment tasks reflect real-world assessment, rather than being a separate assessment removed from the task
  • Result in a finished product, rather than serving as preparation for something else
  • Allow for competing solutions and diversity of outcome i.e. the outcomes can have multiple solutions that are original, rather than a single “correct” response

Looking at the above list it should be easy to see how tasks designed with these characteristics in mind would be similar to the ways we would think about successful clinical practice. In other words, you could see how students who could successfully solve problems designed with this framework might also be able to solve clinical problems. The tasks we give them in the classroom would require them to think and behave in ways that we expect them to think and behave in clinical practice. No more knowledge-practice gap?

Categories
assessment curriculum physiotherapy teaching

Introducing the OSPE format to physiotherapy practicals

Schematic for student movement through the stations

Last year at our planning meeting (every year we meet to review the year and to plan for the upcoming one) we committed to conducting all of our practicals from now on in the OSPE format (Objective Structured Practical Examination). This format has the advantages that all students perform the same assessment tasks and that each student is assessed by every examiner. There are other advantages (and disadvantages), but there’s plenty of literature that discusses these more eloquently than I have time to do here.


We’ve been running all of our practical tests in the department in this format since we made the decision last year, and after a few bumps we’re starting to get it right. We now run two tracks in parallel, so that we can see twice the number of students in the same time; in our first test we were limited to five examiners. There were some other problems that took a few tests to sort out:

  • We didn’t always choose appropriate techniques for the time limit at each station e.g. some techniques were completed well before the allocated time was up, while others were rushed
  • We allotted too much time to move between stations
  • We read the same instructions to every candidate, which wasted time
  • We only realised during the second OSPE that students waiting to take the test still had their cellphones with them

We surveyed the students and staff following the first OSPE and are in the process of reviewing those responses. We knew that we’d get a few things wrong no matter how much we tried and so the survey was an attempt to highlight areas that we wouldn’t necessarily have thought of by ourselves.

We’re going to use Google Docs to collaboratively write up an article based on the student and staff responses, just to highlight the challenges of moving to and running an OSPE in a resource-constrained environment. I’ll follow up this post with the outcome of the article.

If you’ve been through the process of introducing the OSPE format into your assessments, I’d love to hear about the challenges and successes you had.

Categories
assessment curriculum education physiotherapy students teaching

Teaching a practical Movement Science class

This is the first year that I’ve taught Movement Science (i.e. analysis of movement), which was daunting at first as I wasn’t familiar with it. In addition, the module content was almost entirely in hardcopy, so I’ve been typing away like crazy to get it all into a digital format. The practical component of the module has been both challenging and rewarding. In the past we’d demonstrate a technique or analyse a movement for the students and then ask them to break into small groups and do it for themselves. This year I’ve been trying to do it a little differently.

I begin with a very short lecture identifying the key concepts that will be useful during the practical session. For example, if we’re going to do gait analysis I review the normal gait cycle as well as discuss some of the ways that gait might be compromised in a patient with neurological dysfunction. Then I ask them to do the analysis in small groups but without a demonstration. I explain that I don’t have any expectation that they’ll be able to do it at the appropriate level but that they should try anyway. Each student must do the analysis (otherwise some will passively observe) and each student must model the movement sequence (so that they can all be aware of how movement occurs, as well as demonstrate that each person’s “normal” is actually different).

During the practical I move between the groups, addressing any questions they have and, at the same time, getting a feel for their differing levels of understanding. I spend 10-15 minutes with each group, going back to the basic concepts that I presented in the lecture and then using simple movements to do the analysis. Often the students have moved through the different movement sequences quite quickly, not having paid attention to the details and just wanting to “tick off” what they’ve done. When I get to them and we start again, they quickly realise how much they’ve left out.

I’ve found that the students are thinking more deeply about what they’re doing than if I demonstrated a technique at the beginning and asked them to merely copy it. This way, they’re having to figure things out for themselves with only a basic framework to work with. Once they’ve struggled with the analysis for a few minutes, they’ve had the opportunity to work out what they don’t know. Then, when I move around to their group, they not only have several questions but have already tried a few different approaches.

When I’ve worked through all the groups I have an idea of the main concepts that need to be reviewed and elaborated on and can end the session with a practical demonstration highlighting what most of the students were struggling with.

Some of the lessons students leave with, besides the module-specific content:

  • Working the answer out by yourself can be rewarding
  • Trying (and sometimes failing) doesn’t mean that you’re stupid; it’s a valid way to learn
  • Asking questions isn’t a weakness