learning PhD physiotherapy research teaching

Workshop on facilitation techniques using the Conversational Framework

How do we get students to think more deeply about learning in an academic context?

I’m giving a workshop later today. The idea is that we’ll get together all of the facilitators who’ll be working on the module we’re designing (and which I’m evaluating for my PhD) and help them get a grip on the approach to facilitation that we’d like them to use. The objective of the workshop is to help them understand the conceptual basis for facilitation in this module. We’re going to use Laurillard’s “Conversational Framework” as a structure to guide how the facilitators should try to engage with their groups, both in the classroom and in the clinical context. The following notes have been taken from Laurillard’s “Rethinking University Teaching”.

Learning needs to be situated within a context and we can’t separate the knowledge to be learned from the context in which it has to be applied. Conceptual knowledge is not an abstract, intangible thing. It is a tool that can be used as part of an authentic learning activity. There is a unity between the problem, context and solution when the problem is experienced, that is absent when an answer is merely given.

Teaching is essentially an activity that tries to help students change the way they see the world by interpreting the insights of others.

  • “Everyday learning” = a result of our experiences in the world e.g. we develop an implicit awareness of gravity by falling
  • “Academic learning” = a result of our reflections on others’ descriptions of the world e.g. we develop an understanding of a theory of gravity by reading about experiments conducted by other people

Academic learning is different to everyday learning in the sense that the student learns through interpreting the symbols (i.e. language, images, diagrams) of someone else’s view of the world.

The knowledge that students bring with them will affect how they integrate the new knowledge that they learn. Remember the zone of proximal development (ZPD) and how the more knowledgeable other (MKO) guides the student to higher cognitive levels by building on what they already know.

It makes no sense to correct a faulty procedure without also correcting the faulty conceptualisation that supported it (knowledge is situated in action, and action manifests knowledge). Correcting fundamental misconceptions automatically corrects all of the faulty procedures associated with them; correcting the procedure corrects only one way of doing it incorrectly. This is one problem with merely demonstrating a technique: the student is forced to conceive a rationale for the technique, which may be incorrect. By taking them through an experience of solving a problem, the rationale for the technique is implicitly tied to its performance.

Before we can challenge the students’ fundamental misconceptions, we need to know what those misconceptions are. Again, this links back to the ZPD. Without knowing where the student is, we cannot help them get to where they want to be.

Researching the learning process (which is essentially what a facilitator does: act as a dynamic researcher into student learning) should include observation of student performance on a task (e.g. worked problems or written explanations), followed by a retrospective interview in which the student looks back at the task and describes how they experienced it. The interviewer uses the task to provide cues to the student.

The learning process includes five interdependent aspects:

  • Apprehending structure. Students often fail to apprehend the structure of a discourse (e.g. a body of text), and there is often meaning that is implicit in structure (e.g. headings, paragraphs, etc.). When students take a surface approach to studying a text they lose the structure of the arguments and end up with a series of statements that are not related to each other. When they take a deep approach they preserve the structure as well as the original meaning.
  • Integrating parts. Students must learn how to interpret the discipline-specific representations if they are to make sense of them. The way that information is presented can lend itself to deep or surface approaches, as well as create potential “distractors” for the student. The idea is not to ensure that data representation is “easy” for the student to interpret but rather to prepare the student to handle the different representations. Complex scenarios provide opportunities to determine students’ ability to interpret the representations. For example, consider how students are confused when different clinicians advocate different management approaches for the same patient. The student who only comprehends the superficial structure of the interaction is stuck because they cannot perceive that interpretations can be different.
  • Acting on the world. Learning is an activity (classroom-based problem-solving), an imitation of practice (practical sessions in the classroom), or actual practice (seeing patients). The student must engage with the world (i.e. solving problems in the classroom, or treating patients) by performing an action that is based on their understanding of how the world works.
  • Using feedback. As we learn about the world by acting on it, we receive direct feedback and adjust the action in relation to it. The feedback must be perceived as useful to the student (i.e. it must be meaningful), and it must be given immediately (or soon) after the student’s action so that the student can relate the feedback to the action. Helpful feedback also provides the student with specific information on how to adapt their performance.
  • Reflecting on goals. Reflection is about establishing conceptual links between the action, the feedback, and the integration of the two as they relate to the achievement of a goal (e.g. solving a problem). Students often interpret goals as being something required by the teacher and go through the steps necessary to reproduce an outcome, with little intention of understanding the task or the goal (i.e. the tasks are a series of hoops that they have to jump through). The same task is therefore perceived differently by the students and the teacher, and operationalised in different ways. For many students, what it means to achieve the objective / goal is different from what the teacher intends.

Using the above steps, we can see how learning something deeply is complex and difficult to facilitate. In short, the facilitator should try to conduct an interactive dialogue that supports the learning process. The following points describe the components of a teacher-student dialogue that promotes deep learning of a topic.

Apprehending structure

  • Student’s role: look for structure, discern the topic goal (if the goal isn’t explicitly identified, the student lacks the structure to guide their thinking), relate the goal to the structure of the discourse
  • Facilitator’s role: explain phenomena, clarify structure, negotiate the topic goal, ask about internal relations (explain phenomena, make predictions, compare analogous situations)

Interpreting forms of representation

  • Student’s role: model events / systems in terms of forms of representation, interpret forms of representation to model systems / events
  • Facilitator’s role: set mapping tasks between forms of representation and systems / events, relate forms of representation to the student’s view

Acting on descriptions

  • Student’s role: derive implications, solve problems, and test hypotheses to produce descriptions
  • Facilitator’s role: elicit descriptions, compare descriptions, highlight inconsistencies

Using feedback

  • Student’s role: link the teacher’s redescription to the relation between action and goal, to produce a new action on the description (the student gives a description of something, the teacher responds with a different viewpoint that demonstrates an inconsistency, and the student must therefore reframe / describe it again)
  • Facilitator’s role: provide redescription, elicit a new description, support the linking process

Reflecting on goal-action-feedback cycle

  • Student’s role: engage with the goal, relate it to actions and feedback (this is why the goal of the dialogue must be explicit: it allows students to reflect on its relationship to the action / description and the feedback)
  • Facilitator’s role: prompt reflection, support reflection on the goal-action-feedback cycle

There should be a continuing, iterative dialogue between teacher and student that reveals both parties’ conceptions and the differences between them, which then determines the focus for the continuing dialogue. However, it’s not just the process of conducting the dialogue that matters but HOW it is conducted, e.g. there must be an opportunity for the student to interpret forms of representation other than language.

A teaching strategy should be:

  • Discursive – the teacher’s and student’s conceptions should be continually accessible to each other; teacher and student must agree on the learning goals for the topic; the teacher must provide an environment for the discussion, within which the student can generate and receive feedback on descriptions appropriate to the topic goal; the teacher’s descriptions must be meaningful to the student
  • Adaptive – the relationship between the teacher’s and student’s conceptions must serve as the focus for the continuing dialogue; it is the student’s responsibility to use the feedback from their work on the task and relate it to their conception
  • Interactive – the teacher must provide an environment in which the student can act on, generate and receive intrinsic feedback on actions appropriate to the task goal; the student must act to achieve the task goal; the teacher must provide meaningful feedback on their actions that relates to the nature of the task goal
  • Reflective – the teacher must support the process in which students link the feedback on their actions to the topic goal for every level of description within the topic structure; the student must reflect on the task goal, their action on it, and the intrinsic feedback they receive, and link this to their description of their conception of the topic goal
assessment education PhD physiotherapy teaching

Challenging students’ conceptual relationships in clinical education

I just wanted to share a thought while preparing our case notes for the Applied Physiotherapy module we’re developing. One of the designers made a note of the “guideline answers” for facilitators to some of the questions that we might use to trigger students’ thinking. I wrote the following as a comment and didn’t want to lose it when the document is finalised, so I’m putting it here.

“I think we should make sure that, in addition to the ‘answers’, we should identify the main concepts we want students to understand. Remember that we’re using our paper patient (i.e. the case) as a framework for students to learn about concepts. Then, they apply those concepts in the real world to patients. They reflect on those real-world interactions and identify dissonance between their experienced reality (the patient contact) and their abstract conceptions of reality (how they originally conceived of the patient contact). After the patient contact, they feed back to their small groups and facilitators, who together help students create new relationships between concepts. So, in short, the clinical concepts are learned initially through the paper patient, tested in the real world with an actual patient, discussed online (maybe) and then brought back to the classroom for further reflection and refinement. The next week they are exposed to new concepts that build on their previous experiences, and then they get to test those abstractions in the real world again.”

I’m trying to take an intentional approach to using Laurillard’s conception of academic learning, which I’m exploring in “Rethinking University Teaching”.

learning PhD physiotherapy research teaching

From “designing teaching” to “evaluating learning”

Later this month we’ll be implementing a blended approach to teaching and learning in one module in our physiotherapy department. This was to form the main part of my research project, looking at the use of technology enhanced teaching and learning in clinical education. The idea was that I’d look at the process of developing and implementing a blended teaching strategy that integrated an online component, and which would be based on a series of smaller research projects I’ve been working on.

I was quite happy with this until I had a conversation with a colleague, who asked how I planned on determining whether or not the new teaching strategy had actually worked. This threw me a little bit. I thought that I had it figured out…do small research projects to develop understanding of the students and the teaching / learning environment, use those results to inform the development of an intervention, implement the intervention and evaluate the process. Simple, right?

Then why haven’t I been able to shake the feeling that something was missing? I thought that I’d use a combination of outputs or “products of learning” (e.g. student reflective diaries, concept mapping assignments, semi-structured interviews, test results, focus groups, etc.) to evaluate my process and make a recommendation about whether others should consider taking a blended approach to clinical education. I’ve since begun to wonder if that method goes far enough in making a contribution to the field, and if there isn’t something more that I should be doing (my supervisor is convinced that I’ve got enough without having to change my plan at this late stage, and she may be right).

However, when I finally got around to reading Laurillard’s “Rethinking University Teaching”, I was quite taken with her suggested approach. It’s been quite an eye opener, not only in terms of articulating some of the problems that I see in clinical practice with our students, but also helping me to realize the difference between designing teaching activities (which is what I’ve been concentrating on), and evaluating learning (which I’ve ignored because this is hard to do). I also realized that, contrary to a good scientific approach, I didn’t have a working hypothesis, and was essentially just going to describe something without any idea of what would happen. Incidentally, there’s nothing wrong with descriptive research to evaluate a process, but if I can’t also describe the change in learning, isn’t that limiting the study?

I’m now wondering if, in addition to what I’d already planned, I need to conduct interviews with students using the phenomenological approach suggested by Laurillard, i.e. the Conversational Framework. I don’t yet have a great understanding of it, but I’m starting to see how merely aligning a curriculum can’t in itself support any assertions about changes in student learning. I need to be able to say that a blended approach does / does not appear to fundamentally change how students construct meaning, and in order to do so I’m thinking of doing the following:

  • Interview 2nd year and 3rd year students at the very beginning of the module (January, 2012), before they’ve been introduced to case-based learning. My hypothesis is that they’ll display quite superficial mental constructs in terms of their clinical problem-solving ability, as neither group has had much experience with patient contact
  • Interview both groups again in 6 months and evaluate whether or not their constructs have changed. At this point, the 2nd years will have been through 6 months of a blended approach, while the 3rd years will have had one full term of clinical contact with patients. My hypothesis is that the 2nd years will be better able to reason their way through problems, even though the 3rd years will have had more time on clinical rotation

I hope that this will allow me to make a stronger statement about the impact of a blended approach to teaching and learning in clinical education, and to demonstrate that it fundamentally changes students’ constructs from superficial to deep understanding. I’m just not sure if the Conversational Framework is the most appropriate model for evaluating students’ problem-solving ability, as it was initially designed to evaluate multimedia tools.