Mixed methods research: John Creswell seminar


For me, mixed methods research (MMR) is about using qualitative and quantitative data to strengthen an argument that is difficult to support with only one type of data. It’s about bringing together the numbers (quantitative) and the stories (qualitative) to gain a more complete understanding of the world (research problems and questions). We often think of those two approaches as separate and distinct, but when combined they produce something greater than the sum of their parts. Earlier this year we had the opportunity to attend a seminar by John Creswell and Tim Guetterman. Here are my notes.

Introduction to Mixed Methods Research

Practical uses of mixed methods research:

  • Explaining survey results
  • Exploring the use of new instruments in new situations
  • Confirming quantitative results with qualitative findings (why is it often the quantitative component that comes first? Are there situations where quantitative data could be used to explain qualitative results?)
  • Adding qualitative data into experiments
  • Understanding community health research
  • Evaluating programme implementation

What are the major elements of MMR?

  • A methodology (a popular way of conducting research)
  • Collecting and analysing quantitative and qualitative data
  • Integrating different sets of data
  • Framing the study within a set of procedures (called mixed methods designs)
  • Being conscious of a philosophical stance and theoretical orientation

Quantitative data collection (closed-ended) makes use of instruments, checklists and records. Quantitative data analysis uses numeric data for description, comparison, and relating variables.

Qualitative data collection (open-ended) uses interviews, observations, documents and audio-visual materials. Data analysis revolves around using text and image data for coding, theme development, and then relating themes.

What does “integration” mean? We can do this by merging (using one set of data with another), connecting (using one set of data to explain or build on another), or embedding (quan within qual, or qual within quan) the data.
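To make “merging” concrete, it might look like pairing each participant’s quantitative score with the qualitative theme coded from their interview. This is a hypothetical sketch with made-up participants and themes, not a feature of any particular analysis package:

```python
# Hypothetical sketch of "merging" integration: pairing each participant's
# quantitative survey score with the qualitative theme coded from their interview.
survey_scores = {"P1": 72, "P2": 45, "P3": 88}  # quantitative data
interview_themes = {"P1": "confidence", "P2": "anxiety", "P3": "confidence"}  # qualitative data

# Merge the two datasets on participant ID to build a simple joint display.
joint_display = [
    {"participant": pid, "score": survey_scores[pid], "theme": interview_themes[pid]}
    for pid in survey_scores
]

for row in joint_display:
    print(row)
```

The point is only that both data types end up side by side per case, which is what a joint display then presents.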

What is MMR not?

  • Reporting quan and qual data separately (they should be combined)
  • Using informal methods (it is systematic)
  • Simply using the name (it must be rigorous)
  • Collecting either multiple sets of quan or qual data (i.e. not multimethod research; you must collect both quan and qual data)
  • Collecting qual data and then quantitatively analysing it (that is content analysis; MMR collects both forms of data)
  • Simply considering it an evaluation approach (it is a complete methodology)

Specific benefits of MMR:

  • Quan to qual: make quan results more understandable
  • Qual to quan: understand broader applicability of small-sample qual findings
  • Concurrent: robust description and interpretation of multiple sets of different data

Popular mixed method designs:

  • Basic: convergent (bringing qual and quan data together), explanatory sequential (use one set of data to explain another more clearly), exploratory sequential (use qual findings to develop something quantitative, such as an instrument or intervention design)
  • Advanced: intervention, social justice, multistage evaluation

Research questions related to MMR

  • Convergent design: To what extent do the quan and qual results converge?
  • Explanatory design: In what ways do the qual data help to explain the quan results?
  • Exploratory design: In what ways do the quan results generalise the qual findings?

How do we display quan and qual results together (joint display)? There is a lot of variation in how both sets of data can be presented. MAXQDA is an application that can be used to analyse and display different sets of data.

How do we publish MMR? Consider publishing the different sets of data in different papers.

How do we link writing structure to design? Writing about and publishing mixed methods research may require different approaches to article structure and style of writing.

The importance of qualitative research in mixed methods

Key features of qual research:

  • Following the scientific method
  • Listening to participant views
  • Asking open-ended questions
  • Building understanding based on participant views
  • Developing a complex understanding of the problem
  • Going to the setting to gather data
  • Being ethical
  • Analysing the data inductively – letting the findings emerge
  • Writing in a user-friendly way
  • Including rich quotes
  • Acknowledging researcher presence in the study (reflexivity)

Types of problems that qual research is suited to:

  • A need to explore a context
  • When it is important to listen
  • Unusual / different culture
  • Don’t know the questions to ask
  • Understanding a process
  • Need to tell a story

How do our backgrounds inform the way we interpret the world? There is an element of reflexivity and an understanding that data interpretation is dependent on our individual personal and professional contexts.

Writing a good qual purpose statement:

  • Single sentence, often in the form of “The purpose of this study…”
  • A focus on one central phenomenon
  • “Qualitative words” e.g. explore, describe, understand, develop
  • Includes participants and setting

Understanding a central phenomenon:

  • Quan: explaining or predicting variables
  • Qual: understanding a central phenomenon

Data collection:

  • Sampling (purposeful)
  • Site selection (gatekeepers, permissions)
  • Recruitment (incentives)
  • Types of data (observation, interview, public/private documents, audio-visual)

Interview procedures:

  • Create a protocol
  • 5-7 open ended questions (first question is easy to answer e.g. participant role or experience; last question could be “Who else should I speak to in order to get more information about this?”)
  • Allows the participant to create options for responding
  • Participants can voice their experiences and perspectives
  • Record and transcribe for analysis


Observation procedures:

  • Create an observation protocol
  • Descriptive notes (portrait of informant, setting, event) and reflective notes (personal reflections, insight, ideas, confusion, hunches, initial interpretation)
  • Decide on observational stance (e.g. outsider, participant, changing roles)
  • Enter site slowly
  • Conduct multiple observations
  • Summarise at the end of each observation

Types of audio-visual material:

  • Physical trace evidence
  • Videotape or film a social situation, individual or group
  • Examine website pages
  • Collect sounds
  • Collect email or social network messages
  • Examine favorite possessions or ritual objects

How to code data:

  • Read through the data (many pages of text)
  • Divide text into segments (many segments of text)
  • Label segments of information with codes (30-40 codes)
  • Reduce overlap and redundancy (reduce codes to 20)
  • Collapse codes into themes (reduce codes to 5-7 themes)
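The funnel above, from many coded segments down to a handful of themes, can be sketched as a simple mapping exercise. The codes, themes and segments here are hypothetical, purely to make the reduction step concrete:

```python
# Hypothetical sketch of collapsing codes into themes: each coded text
# segment is mapped to a broader theme, shrinking many codes into a few themes.
code_to_theme = {
    "peer support": "community",
    "group work": "community",
    "time pressure": "workload",
    "assessment load": "workload",
}

coded_segments = [
    ("Students helped each other revise", "peer support"),
    ("We ran out of time in the lab", "time pressure"),
    ("The group project built trust", "group work"),
]

# Tally how much evidence each theme has accumulated across segments.
theme_counts = {}
for _text, code in coded_segments:
    theme = code_to_theme[code]
    theme_counts[theme] = theme_counts.get(theme, 0) + 1

print(theme_counts)
```

In practice the mapping from codes to themes is developed iteratively by the researcher, not fixed up front; the sketch only shows how 30-40 codes end up as evidence for 5-7 themes.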

A good qual researcher can identify fine detail but also step back and see the larger themes

How to write a theme passage:

  • Use themes as headings
  • Use codes to build evidence for themes
  • Use quotes and sources of information to demonstrate themes

Writing up the qual study:

  • Description
  • 5-7 themes
  • Use codes and quotes to support themes
  • Tell a good story

Five approaches to qual research:

  • Narrative (comes out of literature)
  • Phenomenology (psychology)
  • Grounded theory (sociology)
  • Ethnography (anthropology)
  • Case study
  • Can also include discourse analysis, participatory approaches

Ethical issues:

  • Respect the site, develop trust, anticipate the extent of the disruption
  • Avoid deceiving participants, discuss purpose
  • Respect potential power imbalances
  • Consider incentive for participants

MAXApp is a mobile app for collecting data on Android and iOS devices:

  • Take photos
  • Write memos
  • Audio recording
  • Location data (geotagging)
  • How is this different to something like Evernote? If you’re already using MAXQDA, it offers integration with the desktop client. If you use another data analysis package, then MAXApp may not be as useful.


Teaching and learning workshop at Mont Fleur

Photo taken while on a short walk during the retreat.

A few weeks ago I spent 3 days at Mont Fleur near Stellenbosch, on a teaching and learning retreat. Next year we’re going to be restructuring 2 of our modules as part of a curriculum review, and I’ll be studying the process as part of my PhD. That part of the project will also form a case study for an NRF-funded, inter-institutional study on the use of emerging technologies in South African higher education.

I used the workshop as an opportunity to develop some of the ideas for how the module will change (more on that in another post), and these are the notes I took during the workshop. Most of what I was writing was specific to the module I was working with, so these notes are the more generic ones that might be useful for others.


Content determines what we teach, but not how we teach. But shouldn’t it be the outcomes that determine the content?

“Planning” for learning

Teaching is intended to make learning possible / there is an intended relationship between teaching and learning

Learning = a recombination of old and new material in order to create personal meaning. Students bring their own experience from the world that we can use to create a scaffold upon which to add new knowledge

We teach what we usually believe is important for them to know

What (and how) we teach is often constrained by external factors:

  • Amount of content
  • Time in which to cover the content (this is not the same as “creating personal meaning”)

We think of content as a series of discrete chunks of an unspecified whole, without much thought given to the relative importance of each topic as it relates to other topics, or about the nature of the relationships between topics

How do we make choices between what to include and exclude?

  • Focus on knowledge structuring
  • What are the key concepts that are at the heart of the module?
  • What are the relationships between the concepts?
  • This marks a shift from dis-embedded facts to inter-related concepts
  • This is how we organise knowledge in the discipline

Task: map the knowledge structure of your module

“Organising knowledge” in the classroom is problematic because knowledge isn’t organised in our brains in the same way that we organise it for students / on a piece of paper. We assign content to discrete categories to make it easier for students to understand / add it to their pre-existing scaffolds, but that’s not how it exists in minds.

Scientific method (our students do a basic physics course in which this method is emphasised, yet they don’t transfer this knowledge to patient assessment):

  1. Observe something
  2. Construct an hypothesis
  3. Test the hypothesis
  4. Is the outcome new knowledge / expected?

Task: create a teaching activity (try to do something different) that is aligned with a major concept in the module, and that also includes graduate attributes and learning outcomes. Can I do the poetry concept? What about gaming? In games, learners are in control of the environment; mastering the task is a symbol of valued status within the group; a game is a demarcated learning activity with set tasks that the learner has to master in order to proceed; feedback is built in; and games can be time and resource constrained.

The activity should include the following points:

  • Align assessment with outcomes and teaching and learning activities (SOLO taxonomy – Structure of Observed Learning Outcomes)
  • Select a range of assessment tools
  • Justify the choice of these tools
  • Explain and defend marks and weightings
  • Meet the criteria for reliability and validity
  • Create appropriate rubrics

Assessment must be aligned with learning outcomes and modular content. It provides students with opportunities to show that they can do what is expected of them. Assessment currently highlights what students don’t know, rather than emphasising what they can do, and looking for ways to build on that strength to fill in the gaps.

Learning is about what the student does, not what the teacher does.

How do you create observable outcomes?

The activity / doing of the activity is important

As a teacher:

  • What type of feedback do you give?
  • When do you give it?
  • What happens to it?
  • Does it lead to improved learning?

Graduate attributes ↔ Learning outcomes ↔ Assessment criteria ↔ T&L activities ↔ Assessment tasks ↔ Assessment strategy

Assessment defines what students regard as important, how they spend their time and how they come to see themselves as individuals (Brown, 2001; in Irons, 2008: 11)

Self-assessment is potentially useful, although it should be low-stakes

Use a range of well-designed assessment tasks to address all of the Intended Learning Outcomes (ILOs) for your module. This will help to provide teachers with evidence of students’ competence / understanding.

In general, quantitative assessment uses marks while qualitative assessment uses rubrics.

Checklist for a rubric:

  • Do the categories reflect the major learning objectives?
  • Are there distinct levels which are assigned names and mark values?
  • Are the descriptions clear? Are they on a continuum and allow for student growth?
  • Is the language clear and easy for students to understand?
  • Is it easy for the teacher to use?
  • Can the rubric be used to evaluate the work? Can it be used for assessing needs? Can students easily identify growth areas needed?


Reflecting on a previous course evaluation:

  • What were you evaluating and why?
  • When was the evaluation conducted?
  • What was positive / negative about the evaluation?
  • What changes did you make as a result of the feedback you received?

Evaluation is an objective process in which data is collected, collated and analysed to produce information or judgements on which decisions for practice change can be based

Course evaluation can be:

  • Teacher focused – for improvement of teaching practice
  • Learner focused – determine whether the course outcomes were achieved

Evaluation can be conducted at any time, depending on the purpose:

  • At the beginning to establish prior knowledge (diagnostic)
  • In the middle to check understanding (formative) e.g. think-pair-share, clickers, minute paper, blogs, reflective writing
  • At the end to determine the effectiveness of the course / to determine whether outcomes have been achieved (summative) e.g. questionnaires, interviews, debriefing sessions, tests

Obtaining information:

  • Feedback from students
  • Peer review of teaching
  • Self-evaluation


Relevant readings

  • Knight (n.d.). A briefing on key concepts: Formative and summative, criterion and norm-referenced assessment
  • Morgan (2008). The Course Improvement Flowchart: A description of a tool and process for the evaluation of university teaching