Theoretical underpinnings of PBL

Schmidt, H.G. (1993). Foundations of problem-based learning: Some explanatory notes. Medical Education, 27: 422-432.

This paper presents the theoretical premises that underlie decisions to use PBL as part of an approach to develop critical thinking. A key premise is that knowledge cannot be transferred, as in a lecture. The learner has to “master it”. An important aspect of learning is that the topic being studied must actually be understood.

The paper identifies 6 fundamental principles of learning derived from cognitive neuroscience and educational psychology, which show how this understanding can be achieved.

  1. Prior knowledge of a subject is the most important determinant of the nature and amount of new information that can be processed. The amount of prior knowledge available determines the extent to which something new can be learned. Cases must therefore be iterative i.e. they must build on what came before. This is either in the form of sections within cases, where each new section is scaffolded onto what came before, or in the form of cases that build on previous cases.
  2. The availability of relevant prior knowledge is a necessary condition for understanding and remembering new information. This prior knowledge needs to be activated by cues in the context in which the current information is being studied. Cases must therefore offer cues that cause prior knowledge to be activated. This is the reason for the “What do I already know?” question.
  3. Knowledge is structured, and the way in which it is structured in memory makes it more or less accessible. “Knowledge” is stored as a relationship between two or more concepts. This is known as a semantic network, and it allows us to impose a structure on what would otherwise be an undifferentiated mass of isolated facts. Facilitators should therefore always ask students to represent their knowledge and understanding as a relationship between concepts e.g. “This is the way it is because of the way this other thing is”. The way that students present their understanding allows us to “see” their misunderstanding, which helps facilitators guide them towards a more accurate knowledge structure. Semantic networks are therefore not necessarily accurate representations of reality, but they provide a means by which we can understand the world. The depth and accuracy of this understanding is a function of the quality of the semantic networks we have. We should therefore not think of the semantic network as “book knowledge” i.e. a set of facts. Rather, it is a reflection of a person’s experiences, views and ideas. One of the problems in the clinical environment is that there are few opportunities for students to develop the semantic networks that establish relationships between the isolated facts (“book knowledge”) learned in the classroom. (A minimal sketch of a semantic network as a simple data structure appears after this list.)
  4. Storing information in memory and retrieving it can be improved when, during learning, elaboration of the material takes place. “Elaboration” means actively establishing and expanding on the relationships between concepts. This process creates multiple “retrieval paths” to understanding. The more “paths” exist, the more likely it is that a concept will be retrieved. Facilitators should therefore aim to question students on the reasoning behind their statements, e.g. “Explain why you think that technique is appropriate for this patient.”
  5. The ability to activate knowledge in long-term memory and make it available for use depends on contextual cues. Learning about topics in the context in which they are likely to be needed increases knowledge retention. Information that is intentionally learned and incidental information about context are simultaneously stored in memory. This is called the “contextual dependency of learning”. Cases and their discussion should therefore be conducted using the language and culture of the profession as tools to guide and scaffold the process. Students must present findings and articulate understanding to each other and facilitators as if they were on the ward.
  6. Being motivated to learn prolongs the amount of processing time a learner puts in, and therefore improves achievement. In other words, someone who feels the urge to learn will be prepared to spend more time on learning than someone who feels less inclined. Facilitators should therefore spend time developing students’ curiosity and motivation to learn, as part of a general approach to developing lifelong learners. Avoid simply “getting the task done”. Rather, try to get students to develop an active and focused curiosity about the topic. When group work is aimed at stimulating interest and engagement, students are more likely to follow up with their own research. Group discussion aimed at clarifying one’s own point of view and being confronted with other perspectives stimulates focused curiosity.
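
To make the idea of a semantic network more concrete, here is a minimal, illustrative sketch of knowledge stored as labelled relationships between concepts, with recall following the links. It is my own gloss on the principle, not something from Schmidt’s paper; the class name and the clinical examples are hypothetical.

```python
# Illustrative sketch only: a semantic network as labelled relationships
# between concepts, rather than a list of isolated facts.
from collections import defaultdict

class SemanticNetwork:
    def __init__(self):
        # concept -> list of (relationship, related concept) propositions
        self.links = defaultdict(list)

    def relate(self, concept, relationship, other):
        """Store a proposition, e.g. 'pulmonary TB' --causes--> 'decreased air entry'."""
        self.links[concept].append((relationship, other))
        self.links[other].append((relationship + " (inverse)", concept))

    def retrieve(self, concept):
        """Follow the retrieval paths leading out of a concept."""
        return self.links[concept]

# Hypothetical clinical example
net = SemanticNetwork()
net.relate("pulmonary TB", "causes", "decreased air entry")
net.relate("decreased air entry", "is assessed by", "auscultation")
print(net.retrieve("decreased air entry"))
```

The more relationships a concept participates in, the more retrieval paths lead back to it, which is the point principles 3 and 4 make about elaboration.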

Problem-based learning is therefore an approach to teaching and learning where students work together in small groups, with guidance from a facilitator, and try to solve problems (in our context, clinical cases).

In order to activate prior knowledge, the clinical problem must first be discussed by the students without them having “the answers”, or reference to the literature. The goals of this preliminary discussion are:

  1. To mobilise the knowledge that they already have available.
  2. To help elaborate on that knowledge i.e. to establish conceptual relationships.
  3. To contextualise the available knowledge within the current case.
  4. To engage the students’ curiosity to find out more.

It is clear, therefore, that students should not have the facts given to them (e.g. in a lecture) prior to discussion of the case in their small groups. They should also be discouraged from simply identifying and allocating research questions to cover individually. The discussion is an essential aspect of the process, based on established theories of learning that aim to drive understanding.

If the clinical case is well-designed, students will begin to identify areas where they lack understanding and will soon begin to ask questions that need answers before they can proceed. The questions derived from this discussion will help them to find information that builds on their prior knowledge.

Upon returning to their groups with the new information, obtained by answering the questions derived in the first discussion, students share and discuss what they have found, which helps to structure the new knowledge into semantic networks. Central to this process is the idea that, while thinking, studying and talking about the clinical case, students are building a context-sensitive cognitive structure, which may help them to understand more complex clinical problems that they encounter later.

Conclusion

The problem-based approach to teaching and learning is premised on 6 fundamental principles of learning derived from cognitive neuroscience and educational psychology. These principles include:

  1. Activating prior knowledge (basing current tasks on previous ones)
  2. Elaborating prior knowledge (forming relationships between concepts)
  3. Discussing problems in small groups (constructing semantic networks)
  4. Designing problems that are contextually relevant (solving problems in the classroom that are cognitively similar to those in other spaces, e.g. the clinical setting)
  5. Fostering curiosity (so that students are internally motivated to spend more time on tasks)

Adding complexity for its own sake

I was discussing a PhD project with a colleague at the HELTASA conference a few weeks ago and she was describing her plan to me. She’s interested in the possibilities that mobile technology brings to higher learning, specifically in nursing education. I gathered that she was talking about mobile as a combination of hardware and software as a means of accessing content, although we didn’t really get into how she was defining mobile for her study.

What I found most interesting was that she was starting from the point that she would be using mobile, and then looking for a problem that she could use it to solve. This seems to be the wrong way around.

We often find people wanting to add complexity (e.g. using mobile devices in the clinical context) without really thinking about whether that added complexity brings any benefit, and then without asking whether the benefit outweighs the cost of the added complexity. Before adding anything to the curriculum we need to ask ourselves, “What are we going to get in return?”

My colleague wanted to use mobile devices to figure out students’ prior knowledge, i.e. she began from the premise that she would be using mobile devices. When I asked her why she didn’t just use pen and paper, she was confused. She said that she couldn’t use pen and paper because she would be using mobile devices. And therein lies the problem. She didn’t say that she wanted students to collaboratively come up with a dataset of “prior knowledge”, or that she wanted all students to see each other’s work, or any other reason that digital or mobile would have an advantage. Her sole reason for wanting to use mobile was that she wanted to use mobile.

By adding complexity to the curriculum without conducting a cost/benefit analysis, you will most likely introduce a set of unintended consequences, such as increasing the actual financial cost of the course, increasing the workload of teachers, or confusing students. Without having a definite objective in mind, which would be enhanced or otherwise facilitated through the addition of the new feature, it’s difficult to argue convincingly for its inclusion.

Content isn’t important, relative to thinking

I just had a brief conversation with a colleague on the nature of the teaching method we’re using in my department. Earlier this year we shifted from a methodology premised on lectures to the use of case-based learning. I’ve been saying for a while that content is not important, but I’ve realised that I haven’t been adding the most important part, which is that content is not important, relative to thinking.

Of course content is important, but we often forget why it’s important. Content doesn’t help students to manage patients (not much anyway). The example I often use is that a student can know many facts about TB, including, for example, its pathology. But, that won’t necessarily help them to manage a patient who has decreased air entry because of the TB.

What will help the student is the ability to link data obtained from the medical folder, patient interview and physical exam with the patient’s signs and symptoms. By establishing relationships between those variables, the student develops an understanding of how to proceed with the patient management process, which includes treatment. There is very little content that the student needs in order to establish those relationships. Where content does feature in those situations, it is as a recipe list of commonly used assessment and treatment interventions, which the student can memorise and apply to a patient who presents in a certain way. This is NOT what we want, though. That approach doesn’t help students adapt and respond to changing conditions.

Knowing the pathology of TB may tell the student WHY there is decreased air entry to the basal aspect of the lungs, but not WHAT TO DO about it (unless you want students to follow recipes). Clinical reasoning is the important part, not content. This is what I’ve been missing when I tell people that content isn’t important. It’s not, but only relative to thinking.

Posted to Diigo 06/15/2012

    • we have only begun to understand the ways that the “social life of information” and the social construction of knowledge can reshape the ways we create learning experiences in the formal college curriculum
    • we define social pedagogies as design approaches for teaching and learning that engage students with what we might call an “authentic audience” (other than the teacher), where the representation of knowledge for an audience is absolutely central to the construction of knowledge in a course
    • social pedagogies strive to build a sense of intellectual community within the classroom and frequently connect students to communities outside the classroom
    • social pedagogies are particularly effective at developing traits of  “adaptive expertise,” which include the ability of the learner to use knowledge flexibly and fluently, to evaluate, filter and distill knowledge for effect, to translate knowledge to new situations, and to understand the limits and assumptions of one’s knowledge.
    • Equally as important is the cultivation of certain attitudes or dispositions characteristic of adaptive experts, including the ability to work with uncertainty, adapt to ambiguity or even failure, and to feel increasingly comfortable working at the edges of one’s competence
    • These kinds of adaptive traits—however valued they may be in the academy in the abstract—are often invisible and elusive in the course design and assessment process.  Designing a course that promotes, supports, and perhaps even evaluates these kinds of traits in students implies that there have to be ways to make these effects visible—through some form of communication
    • Acts of representation are not merely vehicles to convey knowledge; they shape the very act of knowing
    • One of the salient research areas for higher education (and indeed other settings, such as organizational learning) is how to harness the effectiveness of informal learning in the formal curriculum.
    • Our understanding of learning has expanded at a rate that has far outpaced our conceptions of teaching. A growing appreciation for the porous boundaries between the classroom and life experience, along with the power of social learning, authentic audiences, and integrative contexts, has created not only promising changes in learning but also disruptive moments in teaching.
    • Christensen coined the phrase disruptive innovation to refer to a process “by which a product or service takes root initially in simple applications at the bottom of a market and then relentlessly moves ‘up market,’ eventually displacing established competitors.”
    • We might say that the formal curriculum is being pressured from two sides. On the one side is a growing body of data about the power of experiential learning in the co‑curriculum; and on the other side is the world of informal learning and the participatory culture of the Internet. Both of those pressures are reframing what we think of as the formal curriculum.
    • These pressures are disruptive because to this point we have funded and structured our institutions as if the formal curriculum were the center of learning
    • All of us in higher education need to ask ourselves: Can we continue to operate on the assumption that the formal curriculum is the center of the undergraduate experience?
    • higher education was in a powerful transition, moving from an instructional paradigm to a learning paradigm—from offering information to designing learning experiences, from thinking about inputs to focusing on outputs, from being an aggregation of separate activities to becoming an integrated design
    • our understanding of learning is expanding in ways that are at least partially incompatible with the structures of higher education institutions
    • these pressures for accountability are making us simultaneously more thoughtful and more limited in what we count as learning
    • The question that campus leaders need to address is how to reinvent a curriculum that lives in this new space
    • Technologies can play a key role here as new digital, learning, and analytics tools now make it possible to replicate some features of high‑impact activity inside classrooms, whether through the design of inquiry-based learning or through the ability to access and manipulate data, mount simulations, leverage “the crowd” for collaboration and social learning, or redesign when and how students can engage course content. Indeed, one of the most powerful aspects of today’s technologies is that many of the high‑impact features that used to be possible only in small classes can now be experienced not only at a larger scale but, in some cases, to better effect at larger scale.
    • A second response to the location problem of high-impact practices is to design for greater fluidity and connection between the formal curriculum and the experiential co-curriculum. An example is the use of e-portfolios, which allow students to organize learning around the learner rather than around courses or the curriculum.
    • “Drawing on the power of multimedia and personal narrative, recursive use of ePortfolio prompts students to expand their focus from individual courses to a broader educational process.”
    • The continued growth of e-portfolios across higher education reveals a restless search for ways to find coherence that transcends courses and the formal curriculum
    • A second pressure on the formal curriculum is the participatory culture of the web and the informal learning that it cultivates.
    • They looked at a range of web cultures, or participatory cultures, including Wikipedia, gaming environments, and grassroots organizations. They compiled a list of what they considered to be the shared and salient features of these powerful web-based communities:

      • Low barriers to entry
      • Strong support for sharing one’s contributions
      • Informal mentorship, from experienced to novice
      • A sense of connection to each other
      • A sense of ownership in what is being created
      • A strong collective sense that something is at stake
  • How many college classrooms or course experiences include this set of features? In how many courses do students feel a sense of community, a sense of mentorship, a sense of collective investment, a sense that what is being created matters?
  • Maybe that’s the intended role of the formal curriculum: to prepare students to have integrative experiences elsewhere
  • the typical school curriculum is built from content (“learning about”) leading to practice (“learning to be”), where the vast majority of useful knowledge is to be found. In a typical formal curriculum, students are first packed with knowledge, and if they stick with something long enough (i.e., major in a discipline), they eventually get to the point of engaging in practice. Brown argues that people instead learn best by “practicing the content.” That is, we start in practice, and practice drives us to content. Or, more likely, the optimal way to learn is reciprocally or spirally between practice and content.
  • Brown’s formulation echoes the growing body of inductive and inquiry-based learning research that has convincingly demonstrated increased learning gains, in certain well-designed conditions, when students are first “presented with a challenge and then learn what they need to know to address the challenge.”
  • how do we reverse the flow, or flip the curriculum, to ensure that practice is emphasized at least as early in the curriculum as content? How can students “learn to be,” through both the formal and the experiential curriculum?
  • In the learning paradigm, we are focusing not on the expert’s products but, rather, on the expert’s practice.
  • we help faculty analyze their teaching by slowing down and thinking about what it is that a student needs to do well in order to be successful with complex tasks
  • Which department is responsible for teaching students how to speak from a position of authority? Where do we find evidence of someone learning to speak from a position of authority? Which assessment rubric do we use for that? Critical thinking? Oral and written communication? Integrative learning? Lifelong learning? Of course, when faculty speak of “authority,” they mean not just volume, but the confidence that comes from critical thought and depth. Learning to “speak from a position of authority” is an idea rooted in expert practice. It is no more a “soft skill” than are the other dimensions of learning that we are coming to value explicitly and systematically as outcomes of higher education—dimensions such as making discerning judgments based on practical reasoning, acting reflectively, taking risks, engaging in civil if difficult discourse, and proceeding with confidence in the face of uncertainty.
  • Designing backward from those kinds of outcomes, we are compelled to imagine ways to ask students, early and often, to engage in the practice of thinking in a given domain, often in the context of messy problems.
  • What is the relationship between the intermediate activity and the stages of intellectual development or the constituent skills and dispositions of a discipline? What if the activities enabled by social media tools are key to helping students learn how to speak with authority?
  • If our concept of learning has outstripped our notion of teaching, how can we expand our notion of teaching—particularly from the perspective of instructional support and innovation?
  • In the traditional model of course design, a well-meaning instructor seeking to make a change in a course talks separately with the teaching center staff, with the technology staff, with the librarians, and with the writing center folks. Then, when the course is implemented, the instructor alone deals with the students in the course—except that the students are often going back for help with assignments to the technology staff, to the librarians, and to the writing center folks (although usually different people who know nothing of the instructor’s original intent). So they are completing the cycle, but in a completely disconnected way. Iannuzzi’s team‑based design thinks about all of these players from the beginning. One of the first changes in this model is that the instructor is no longer at the center. Instead, the course and student learning are at the center, surrounded by all of these other players at the table.
  • A key aspect of the team-based design is the move beyond individualistic approaches to course innovation. In higher education, we have long invested in the notion that the way to innovate is by converting faculty. This move represents a shift in strategy: instead of trying to change faculty so that they might change their courses, this model focuses on changing course structures so that faculty will be empowered and supported in an expanded approach to teaching as a result of teaching these courses.
  • we need to move beyond our old assumptions that it is primarily the students’ responsibility to integrate all the disparate parts of an undergraduate education. We must fully grasp that students will learn to integrate deeply and meaningfully only insofar as we design a curriculum that cultivates that; and designing such a curriculum requires that we similarly plan, strategize and execute integratively across the boundaries within our institutions.
  • we need to think more about how to move beyond the individualistic faculty change model. We need to get involved in team-design and implementation models on our campuses, and we need to consider that doing so could fundamentally change the ways that the burdens of innovation are often placed solely on the shoulders of faculty (whose lives are largely already overdetermined) as well as how certain academic support staff (e.g., IT organizations, student affairs, librarians) think of their professional identities and their engagement with the “curriculum.”
    • Thomson Reuters assigns most journals a yearly Impact Factor (IF), which is defined as the mean citation rate during that year of the papers published in that journal during the previous 2 years (a formula for this definition is sketched after this list)
    • Jobs, grants, prestige, and career advancement are all partially based on an admittedly flawed concept
    • Impact factors were developed in the early 20th century to help American university libraries with their journal purchasing decisions. As intended, IFs deeply affected the journal circulation and availability
    • Until about 20 years ago, printed, physical journals were the main way in which scientific communication was disseminated
    • Now we conduct electronic literature searches on specific subjects, using keywords, author names, and citation trees. As long as the papers are available digitally, they can be downloaded and read individually, regardless of the journal whence they came, or the journal’s IF.
    • This change in our reading patterns whereby papers are no longer bound to their respective journals led us to predict that in the past 20 years the relationship between IF and papers’ citation rates had to be weakening.
    • we found that until 1990, of all papers, the proportion of top (i.e., most cited) papers published in the top (i.e., highest IF) journals had been increasing. So, the top journals were becoming the exclusive depositories of the most cited research. However, since 1991 the pattern has been the exact opposite. Among top papers, the proportion NOT published in top journals was decreasing, but now it is increasing. Hence, the best (i.e., most cited) work now comes from increasingly diverse sources, irrespective of the journals’ IFs.
    • in their effort to attract high-quality papers, journals might have to shift their attention away from their IFs and instead focus on other issues, such as increasing online availability, decreasing publication costs while improving post-acceptance production assistance, and ensuring a fast, fair and professional review process.
    • As the relation between IF and paper quality continues to weaken, such simplistic cash-per-paper practices based on journal IFs will likely be abandoned.
    • knowing that their papers will stand on their own might also encourage researchers to abandon their fixation on high IF journals. Journals with established reputations might remain preferable for a while, but in general, the incentive to publish exclusively in high IF journals will diminish. Science will become more democratic; a larger number of editors and reviewers will decide what gets published, and the scientific community at large will decide which papers get cited, independently of journal IFs.
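
As a gloss on the definition quoted in the first bullet above (my own formulation, not taken from the article), the Impact Factor of a journal J in year Y can be written as:

```latex
\mathrm{IF}_{Y}(J) =
  \frac{\text{citations received in year } Y \text{ by papers that } J \text{ published in years } Y-1 \text{ and } Y-2}
       {\text{number of papers that } J \text{ published in years } Y-1 \text{ and } Y-2}
```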

Posted from Diigo. The rest of my favorite links are here.

From “designing teaching” to “evaluating learning”

Later this month we’ll be implementing a blended approach to teaching and learning in one module in our physiotherapy department. This was to form the main part of my research project, looking at the use of technology enhanced teaching and learning in clinical education. The idea was that I’d look at the process of developing and implementing a blended teaching strategy that integrated an online component, and which would be based on a series of smaller research projects I’ve been working on.

I was quite happy with this until I had a conversation with a colleague, who asked how I planned on determining whether or not the new teaching strategy had actually worked. This threw me a little bit. I thought that I had it figured out…do small research projects to develop understanding of the students and the teaching / learning environment, use those results to inform the development of an intervention, implement the intervention and evaluate the process. Simple, right?

Then why haven’t I been able to shake the feeling that something was missing? I thought that I’d use a combination of outputs or “products of learning” (e.g. student reflective diaries, concept mapping assignments, semi-structured interviews, test results, focus groups, etc.) to evaluate my process and make a recommendation about whether others should consider taking a blended approach to clinical education. I’ve since begun to wonder if that method goes far enough in making a contribution to the field, and if there isn’t something more that I should be doing (my supervisor is convinced that I’ve got enough without having to change my plan at this late stage, and she may be right).

However, when I finally got around to reading Laurillard’s “Rethinking University Teaching”, I was quite taken with her suggested approach. It’s been quite an eye opener, not only in terms of articulating some of the problems that I see in clinical practice with our students, but also helping me to realize the difference between designing teaching activities (which is what I’ve been concentrating on), and evaluating learning (which I’ve ignored because this is hard to do). I also realized that, contrary to a good scientific approach, I didn’t have a working hypothesis, and was essentially just going to describe something without any idea of what would happen. Incidentally, there’s nothing wrong with descriptive research to evaluate a process, but if I can’t also describe the change in learning, isn’t that limiting the study?

I’m now wondering if, in addition to what I’d already planned, I need to conduct interviews with students using the phenomenological approach suggested by Laurillard, i.e. the Conversational Framework. I don’t yet have a great understanding of it, but I’m starting to see how merely aligning a curriculum can’t in itself support any assertions about changes in student learning. I need to be able to say that a blended approach does / does not appear to fundamentally change how students construct meaning, and in order to do so I’m thinking of doing the following:

  • Interview 2nd year and 3rd year students at the very beginning of the module (January 2012), before they’ve been introduced to case-based learning. My hypothesis is that they’ll display quite superficial mental constructs in terms of their clinical problem-solving ability, as neither group has had much experience with patient contact
  • Interview both groups again in 6 months and evaluate whether or not their constructs have changed. At this point, the 2nd years will have been through 6 months of a blended approach, while the 3rd years will have had one full term of clinical contact with patients. My hypothesis is that the 2nd years will be better able to reason their way through problems, even though the 3rd years will have had more time on clinical rotation

I hope that this will allow me to make a stronger statement about the impact of a blended approach to teaching and learning in clinical education, and to be able to demonstrate that it fundamentally changes students’ constructs from superficial to deep understanding. I’m just not sure if the Conversational Framework is the most appropriate model to evaluate students’ problem-solving ability, as it was initially designed to evaluate multimedia tools.

Results of my Delphi first round

I’ve recently finished the analysis of the first round of the Delphi study that I’m conducting as part of my PhD. The aim of the study is to identify the personal and professional attributes that determine patient outcomes, as well as the challenges faced in clinical education. These results will serve to inform the development of the next round, in which clinical educators will suggest teaching strategies that could be used to develop these attributes and overcome the challenges.

Participants from the first round had a wide range of clinical, supervision and teaching experience, as well as varied domain expertise. Several themes were identified, which are summarised below.

In terms of the knowledge and skills required of competent and capable therapists, respondents highlighted the following:

  • They must have a wide range of technical and interpersonal skills, as well as a good knowledge base, and be prepared to continually develop in this area.
  • Professionalism, clinical reasoning, critical analysis and understanding were all identified as being important, but responses contained little else to further explain what these concepts mean to them.

In terms of the personal and professional attributes and attitudes that impact on patient care and outcomes, respondents reported:

  • A diverse range of personal values that they believe have relevance in terms of patient care
  • These values were often expressed in terms of a relationship, either between teachers and students, or between students and patients
  • Emotional awareness (of self and others) was highlighted

In terms of the challenges that students face throughout their training:

  • Fear and anxiety, possibly as a result of poor confidence and a lack of knowledge and skills, leading to insecurity, confusion and uncertainty
  • Lack of self-awareness as it relates to their capacity to make effective clinical decisions and reason their way through problems
  • A disconnect between merely “providing a service” and “serving”
  • They lack positive and supportive clinical learning environments, have poor role models and often aren’t given the time necessary to reflect on their experiences
  • The clinical setting is complex and dynamic, a fact that students struggle with, especially when it comes to dealing with complexity and uncertainty inherent in clinical practice
  • Students often “silo” knowledge and skills, and struggle to transfer between different contexts
  • Students struggle with the “hidden culture” of the profession, i.e. the language, values and norms that clinicians take for granted

These results are not significantly different from the literature in terms of the professional and personal attributes that healthcare professionals deem to be important for patient outcomes.

The second round of the Delphi is currently underway and will focus on the teaching strategies that could potentially be used to develop the attitudes and attributes highlighted in the first round.

AMEE conference (day 3)

Today was the final day of AMEE 2011. Here are the notes I took.

The influence of social networks on students’ learning
J Hommes

Collaborative learning is supposed to facilitate interaction and its impact on student learning

Difficult to quantify the role of informal learning

Informal social interaction: behaviour is the result of interactions and relationships between people

Many variables can impact on student learning (e.g. motivation)

How does the effect of social networks on students’ learning relate to possible confounders?

Methods:

  • Academic motivation scale (determine motivation)
  • College adaptation questionnaire (determine social interactions)
  • GPA (previous performance impacts on future performance)
  • Factual knowledge test
  • Social network analysis (looked at Friendship, Giving information, Getting information)
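
As a rough illustration of what the social network analysis mentioned in the last bullet might involve for, say, the “getting information” network, here is a minimal sketch. The data and the choice of in-degree centrality are my own assumptions for illustration, not the study’s actual instrument or measures.

```python
# Illustrative sketch only: a tiny "getting information" network analysed with networkx.
import networkx as nx

# Hypothetical directed edges: ("ann", "ben") means Ann gets information from Ben.
edges = [("ann", "ben"), ("ann", "cara"), ("ben", "cara"), ("dave", "cara")]
G = nx.DiGraph(edges)

# In-degree centrality: how often a student is named as an information source.
centrality = nx.in_degree_centrality(G)
print(sorted(centrality.items(), key=lambda kv: kv[1], reverse=True))
```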

Social interaction in informal contexts has a substantial influence on learning

Could it also be true that good learners are also well-developed social beings? If learning is inherently social, then people who are more social might just be better learners, and it has nothing to do with the social network?

Veterinary students’ use of and attitude toward Facebook
Jason Coe

Physicians share information on Facebook that could potentially upset their patients

People disclose more personal information on Facebook than they do in general

32% of students’ profiles contained information that could reflect poorly on the student or profession → venting, breaches of confidentiality, overtly sexual images / behavioural issues, substance abuse

78% of students believed that their profile pictures accurately reflected who they were at that time; 56% believed that their current profile pictures accurately represent them as future professionals

More professionals than students believed that posting comments and pictures about clients on Facebook was acceptable

Should professional students be held to a higher standard than other students?

Should Facebook information be used in hiring decisions?

An awareness of consequences causes students to disclose less on Facebook than they do in general

Individuals have a right to autonomy → education and guidelines can minimise risks

The issue of disclosure is important when it comes to using online social networks

Developing a network of veterinary ICT in education to support informal lifelong learning
S Baillie and P van Beukelen

Goals were to generate evidence of benefits and limitations of informal, lifelong learning using ICT

Focus group questions explored what would affect participation in an online group:

  • What activities? Networking, finding information, asking questions, discussions
  • What motivations? Anonymity, sharing knowledge, convenience, saving time, travel and cost issues, required component
  • What support? Employer support, attitude, help desk, post moderator (reliable information)
  • What barriers? Time to participate, lack of confidence, lack of technical knowledge, understanding
  • What challenges? Poor site usability, professionalism issues / behavioural change

It was important to have behavioural guidelines for participation in the online network, e.g. respect, etc.

Can YouTube help students in learning surface anatomy?
Samy Azer

Aim: to determine if YouTube videos can provide useful information on surface anatomy

For each video, the following was recorded:

  • Title
  • Authors
  • Duration of video
  • Number of viewers
  • Posted comments
  • Number of days on YouTube
  • Name of creator

No simple system is available for assessing video quality, but each video was scored on the following criteria (yes = 1, no = 0; a minimal scoring sketch follows the list):

  • Content – scientifically correct, images clear
  • Technical
  • Authority of author / creator (but how was this determined?)
  • Title reflects video content?
  • Clear audio quality
  • Reasonable download time
  • Educational objectives stated
  • Up to date creator information
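
A minimal sketch of this kind of binary scoring (illustrative only; the criterion names are my own paraphrase of the list above, not the study’s instrument):

```python
# Illustrative sketch only: score a video against binary quality criteria (yes = 1, no = 0).
criteria = [
    "content_scientifically_correct",
    "images_clear",
    "technically_adequate",
    "author_authority",
    "title_reflects_content",
    "clear_audio",
    "reasonable_download_time",
    "educational_objectives_stated",
    "creator_information_up_to_date",
]

def score_video(ratings):
    """ratings: dict mapping a criterion to True/False; returns the total yes-count."""
    return sum(1 for c in criteria if ratings.get(c, False))

# Hypothetical rating of a single video
example = {c: True for c in criteria}
example["educational_objectives_stated"] = False
print(score_video(example))  # 8 out of 9
```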

57 out of 235 videos were deemed to be relevant, but only 15 of those were determined to have educational usefulness. Several videos were created by students and were often of a high quality

Conclusion was that YouTube is currently an inadequate source of information for learning surface anatomy, and that medical schools should take responsibility for creating and sharing resources online

Social media and the medical profession
Dror Maor

What is public and private? How do we separate out our personal and professional identities? Should we separate them out?

Discussion of the role of, and use of, social media by medical professionals (http://ama.com.au/node/6231)

Why do people think that using social media takes anything away from what we already do? Social media doesn’t take anything away from the hallway conversations…it’s not “better” or “worse” than “the old” way of doing things.

From “knowledge transfer” to “knowledge interaction” – changing models of research use, influence and impact
Huw Davies

Research, evidence and practice → moving from “knowing differently” to “doing differently”

There’s a lot of noise, but are we having any impact on practice? Who are we talking to? What kinds of conversations are we having? How can our collective input have an impact?

Currently, the model entails doing research, publishing it and hoping that clinicians change their behaviour based on the results. Few questions are asked about how the knowledge transfer actually takes place.

How does knowledge “move around” complex systems?

The current system is too:

  • Simple
  • Rational
  • Linear

Current outcomes are variable, inefficient, ineffective, unsafe, and sometimes, inhumane

Why is it that, when we know more than ever before, we perform so poorly within our healthcare systems?

  • Goals are ambiguous
  • Workforce is multiple
  • Environment is complex
  • Tasks are complex and ambiguous

Even though organisations are highly social, the belief is that caregivers act as they do because of personal knowledge, motives and skills

Major influences on outcomes are exerted through the organisations and systems through which services are delivered, not individual characteristics (this applies equally to educational outcomes)

Context matters → it’s situational, not dispositional (behaviour is as much about the context as it is about dispositions)

Reductive and mechanistic approaches only get us so far. “Rocket science” is merely complicated. Tackling educational and health issues is genuinely complex because of the connections between people, each with their own unpredictable behaviours, and contexts that change over time in non-linear ways

Throwing information at people doesn’t generate appropriate responses / behaviours

For some, “evidence” is reduced to research on “what works”. Consequences of this:

  • It’s relatively straightforward if the right methods are used
  • It provides instruction on what to do i.e. it allows us to make choices more easily
  • Assumes that the answers are out there to be found

The knowledge required for effective services is broader than “what works”:

  • Knowing about the problems: their nature, inter-relationships, “lived experiences”
  • Knowing why: explaining the relationship between values and policies, and how they have changed over time
  • Knowing how: how to put change into practice, what is pragmatic
  • Knowing who: who should be involved, how do we build alliances, connect clinical and non-clinical

Challenge of integrating “knowledge”:

  • Uncertain process, engages with values, existing (tacit) knowledge, experience
  • socially and contextually situated
  • not necessarily convergent
  • may require difficult “unlearning”

Also, not just what knowledge:

  • Whose knowledge / evidence?
    • “Evidence” may be used selectively and tactically; its use is not necessarily disinterested (evidence is what the powerful say it is)
    • Knowledge and power are co-constructed

Knowledge is not “a thing”; it is a process of “knowing”

Knowledge is what happens when you take data from research, combine it with experience, and share it through dialogue

Uncovering evidence and understanding its complexity
Barry Issenberg

“If there’s evidence, I feel confident. If there’s no evidence, I’m uncomfortable”

Evidence is only useful if it meets the needs of the user. Who is the user?

Features of learning through simulation (BEME guide 4), a systematic review:

  • Feedback
  • Repetitive practice
  • Curriculum integration
  • Varying difficulty
  • Adaptive learning
  • Clinical variation
  • Controlled environments
  • Individualised learning
  • Defined outcomes

Discipline expertise doesn’t mean you can teach

Implementing clinical training in a complex health care system is challenging

Understanding the complexity of medical education → relationships between:

  • Learner characteristics, experiences, educational and professional context
  • Learning task: looked at psychomotor and procedural skills but behavioural not addressed
  • Instruction (deliberate practice under direct supervision in groups or individually, for as long as it takes)
  • Teacher characteristics and qualifications (these are not well-defined), clinical experience doesn’t equal teaching experience
  • Curriculum content and format, blend of presentations and practice sessions, expert demonstrations, orientation
  • Assessment: content and format
  • Environments should be supportive; there needs to be infrastructure, and time set aside
  • Evaluation of the programme: target, format, consequences (Kirkpatrick levels)
  • Society: politics and culture taken into account, patient safety, clinical opportunity, clinical advances
  • Setting: wide variety of settings, including schools, workplaces
  • Organisation: need to involve all stakeholders

Journals have a limited role to play in knowledge interaction, and appeal mainly to people who just want to do more research

Without context and explicit intention, medical education will never have the impact on society that it would like to (Charles Boelen)

 

Applying theoretical concepts to clinical practice

Concept map about concept mapping taken from IHMC website

I just finished giving feedback to my students on the concept mapping assignment they’re busy with. It’s the first time I’ve used concept mapping in an assignment and, in addition to supporting the students’ learning, I’m also trying to see if it helps me figure out what they really understand about applying the theory we cover in class to clinical contexts. They’re really struggling with what seem to be basic ideas, highlighting the fact that maybe the ideas aren’t so basic after all. I have to remind myself that clinical reasoning is a skill that takes many years to develop through reflection and isn’t really something I can “teach”. Or is it?

For this assignment I wanted the students to set a learning objective for themselves (I gave examples of how to do this, including using SMART principles of goal setting). They also needed to highlight a particular clinical problem that they wanted to explore and how they would use concepts from the Movement Science module to do this. They needed to describe a clinical scenario / patient presentation and use it to identify the problem they wanted to explore. From that short presentation, they then had to derive a list of keywords that would become the main concepts for the concept map.

Here’s a list of the most common problems I found after reviewing their initial drafts:

  • Many of them lacked alignment between the patient presentation, the learning objective, the keywords / propositions and the final concept map
  • Many of the learning objectives were vague. They really found it hard to design appropriate learning objectives, which meant that their whole assignment was muddled
  • There were two processes going on in the students’ minds: patient management, and their own learning. This assignment was about student learning, but most of the students were focused on patient management. This was especially clear in the learning objective and actual maps they created, which all had a clinical focus on the interventions they would use to treat the patient, rather than the learning concepts they would apply
  • Most of the students created hierarchical maps which failed to identify complex relationships between concepts

After going through their initial drafts, I had another session with them to go through the feedback I’d given and to provide more examples of what I expected from them. This assignment is proving far more difficult for the students than I’d expected. However, I’m not sure if it’s because they can’t apply theoretical concepts to clinical scenarios, or if they just don’t have a good understanding of how to create concept maps. I think that they’re having difficulty thinking in terms of relationships between concepts. The maps they’ve been drawing are appropriate in terms of the interventions they’d choose to manage their patients, but the students can’t seem to transfer the concepts from the classroom into clinical contexts.

They’re used to memorising the content because that’s how we assess them i.e. our assessments are knowledge-based. Then they go into clinical contexts and almost have to re-learn the theory again in the clinical environment. There doesn’t seem to be much transfer going on, in terms of moving knowledge from the classroom context to the clinical one. I haven’t researched this yet, but I wonder what sort of graduate we’d get if we scrapped classroom teaching altogether and just did everything on the wards and in the clinics? I understand the logistical issues of an apprentice-based approach to teaching large groups but if we didn’t have classroom time at all, maybe it’d be possible?

Posted to Diigo 09/19/2010

  • Knowledge in complex settings is a process of negotiation…an interplay of entities…a dance. And being knowledgeable in these settings requires an awareness of process and flow, not of being in possession of “knowledge”
  • What has happened in journalism will also happen in education: breakdown of a single controlled narrative, increased role of amateurs, challenges to the existing business model, etc
  • How should teaching and learning be structured in a networked world?
  • Should Africa (or any region of the world) duplicate the educational system of Europe or North America? Should Africa adopt the curriculum of these regions? How should teaching and learning be delivered? How many schools should be built? What is the cost of building the physical support structures for learning and knowledge for a region like Africa? Is there a better way? What are the costs of building a technological infrastructure – internet connectivity and computers – in comparison to building schools and purchasing textbooks? (it’s not an either or question – effective learning with technology from my experience, involves a blend of online and face-to-face).
  • We need to throw out most of our assumptions about learning systems, content, learning design and delivery in order to build the future of Africa’s learning and knowledge infrastructure
  • Ingenuity and creativity from within Africa will address this challenge – it’s not something that development agencies should “do for Africans”
  • The learning process is less uncertain. How will the next generation of Africans be educated? What is the learning model that will fulfill this urgent, foundational, task?
  • Right now, educational content flows into Africa which creates an external cultural injection. African educators have an opportunity to create a content/cultural outflow from Africa by increasing collaboration with each other and producing open content for other educational systems in the world to utilize. Open content is not enough. We need to open up the learning system as a whole to the benefits of participation, socialization, networks, and peer interaction
  • Education in Africa, like many other systems in the world, would benefit enormously from a shift to social participative networked learning
  • Two-critical questions need to be answered by anyone who wants to adjust the education system: 1. What does technology now do better than people can? 2. What can people do better than technology?
  • Content duplication, scaling, and reproduction are far better managed by technology. One recorded lecture can be seen a thousand times online without significant increase in expense. The content broadcast of any course can be opened and shared online fairly easily, using simple tools like Skype, ustream, or Elluminate. Duplicating content – where we are now with open educational resources is easy and cheap.
  • the social dimensions of learning are still best managed by humans
  • Sugata Mitra has demonstrated the value of peer and self-directed learning in India
  • In Africa, the foundational learning and knowledge development that must take place to break the cycle of crisis and urgency can best be met through social participative networked learning. In this model, educators can take advantage of the scalability of open content, the broadcast potential of lectures and recordings, and the social interactive potential of large-scale peer-based learning
  • Traditional educational models simply cannot scale rapidly enough
1. It should be based on a unit of influence that is under the control of each individual (i.e. connections not networks)
2. Scale social interactions (not only content) so large network learning occurs, but in a way that permits various group/collective sizes
3. Promote and benefit from learner autonomy, helping learners to build skills and capacity for ongoing learning
4. Use distributed, decentralized technical infrastructure (p2p not centralized)
5. Extensively use learning analytics, preferably blurring physical and virtual interactions
6. Use curriculum intelligently (linked data/semantic web) in order to provide learners with personal and adaptive paths
7. Allow information splicing so that flows can be adjusted and organized to reflect different learning and social tasks
8. Enable easy variance of contexts – or as my colleague Jon Dron states – “context switching”.
9. Offer varying levels of support and structure, under the control of the learner. If a subject is too challenging, learners can choose a structured learning path. Or, if learners prefer greater autonomy, more flexible paths can be adopted.
10. The system needs to learn from the learners (Hunch is a good example)
11. Integrate activities from various services so learners can centrally interact with data left in other services (Greplin)
12. Provide learners with the tools to connect and form learning networks with others in a course and across various disciplines (diversity exposure to ideas and connections needs to be intentional)