Lambert Schuwirth: Where are we now with Competency-Based Training in Healthcare?

Like so many other disciplines, medical education has its vogues: topics, ideas and approaches that become popular so quickly and so massively that some refer to them as ‘hypes’.

I must confess that I too think that some developments in medical education are hypes, and I cannot escape the impression that some are solutions in search of a problem. I am not happy with such hypes, because they make it more difficult to defend medical education as a serious and scientific discipline, and thereby do it harm.

Sometimes they even lead to very costly mistakes. The epitome of such a mistake was the use of long, branched scenario simulations for high-stakes, summative testing of clinical reasoning in the 1970s and 1980s. What seemed to be a good and valid approach turned out to be inherently flawed, but this was only discovered after millions of dollars had been spent on projects and numerous candidates had been failed on the basis of faulty assessments. This, by the way, had nothing to do with a lack of computers, virtual reality or imagination, but with a lack of understanding of what clinical reasoning actually is – in other words, a lack of fundamental research. But this is long behind us, and medical education has since progressed as a discipline.

So where do we place the development of competency-based education? Is it in the ‘hype’ basket or in the ‘genuine step forward’ basket? I would argue that it is a step in the right direction, but I would also argue that we are not there yet. Let me explain why I think this way.

Traditional Model

A typical traditional model of medical competence describes it as the combination of knowledge, skills, problem-solving ability and attitudes, assuming that each of them could be separately taught and separately assessed. Each of these four pillars was more or less seen as a personality trait: relatively stable and relatively generic. So, for example, one could have good problem-solving ability and no knowledge, or vice versa. In addition, it was assumed that for each of them there was one single best assessment method; vivas, for example, would be best for clinical reasoning, OSCEs best for skills, etc.

Unfortunately, robust research findings do not support this intuitive model. Scores, for example, generalise much better within content across formats than within format across content; one study showed that a knowledge test on skills correlated much better with an actual OSCE station on the same content than two different OSCE stations within the same examination correlated with each other. Numerous studies have demonstrated that the validity of an assessment is determined by what is asked, rather than by whether the question is open-ended or multiple-choice. These and other findings meant that the traditional model of ‘knowledge’, ‘skills’, ‘problem-solving ability’ and ‘attitudes’ was no longer tenable, and a new, more integrative view of competence had to be developed: one based on the finding that content matters rather than format, and one that seeks to combine information that is content-similar rather than format-similar.

In addition to this, medical curricula were often structured according to the organisation of healthcare. In this structure, a competent graduate is seen as someone who has successfully completed all the individual discipline-based examinations and successfully navigated all the discipline-based clerkships, but without any integration between them.

Of course, patients are not naturally ‘structured’ the way healthcare is structured; they may have undifferentiated complaints, combinations of morbidity, polypharmacy, etc. So here, too, there is a need to reorganise our curricula, preferably with the patient as the integrative factor rather than the organisation of secondary and tertiary healthcare.

Competency-Based Learning

Competency-based education has been suggested as a way out of these dilemmas. It suggests that the structure of education and the definition of outcomes are better organised around content themes than around a sort of personality trait, putting patients and the management of patients at the heart of the process. Competencies, although they have been defined in numerous ways, can generally be seen as simple to complex tasks that a successful candidate must be able to handle, using the correct and relevant knowledge, skills, attitudes and meta-cognitions at the right time to manage the situation successfully.

Are we there yet? No, there is still considerable ground to be covered before we have truly competency-based education, and some changes in thinking may need to take place.

The first is a change from a reductionist to a holistic view. Our education and assessment often deconstruct competence into little teachable and assessable pixels and then struggle to put them back together to see the whole picture. That is why arbitrary decisions have to be made about cut-off scores, about how to combine assessment results, and so on. In a more holistic approach, the whole picture is described first and, when needed, the individual pixels can be inspected. For assessment this means that competence is not a combination of measurements, but a narrative in which measurement information is used. This is not unlike a patient’s condition: a narrative in which lab values can be used.

The second is a change from linear-dynamic to more complex views. Many essential aspects of competence cannot be captured in a one-size-fits-all model. Feedback, for example, is not given in exactly the same way to all students in all situations; it is an adaptable and largely unpredictable process. The same applies to healthcare: our students should not be taught simple tricks. They are not expected to deal with every patient in every situation in exactly the same way, but need to be adaptable and agile so as to cater to every – often unexpected – situation. No doctor can predict exactly what will be said 2 minutes and 45 seconds into a consultation, but they can be pretty confident that their reaction will be correct and that they will not overstep boundaries. Our education and assessment approaches will increasingly need to reflect this complexity and uncertainty.

The third is to do away with our fear of qualitative information. In assessment specifically, it is often thought that numbers are objective and therefore reliable. Yet even in research, the typical quantitative article still contains many more words than numbers, because the words are needed to make sense of the numbers. Even the often-quoted Lord Kelvin, who argued that “numbers or measurements are better than words”, apparently needed words to make that claim (it was not a mathematical derivation but a rhetorical argument). I am not saying that we need to do away with numbers; quite the opposite. But we do need to understand that, while the collection of data can happen objectively, any interpretation involves subjective human judgement, and we should not fear that. Is this a plea to be arbitrary and capricious with assessment and education? No, it is not: just as objectivity does not imply reliability, subjectivity does not imply unreliability, and much has been written about this.

So, in summary, education and assessment need to be of high quality, transparent, meaningful and valid, and the developments towards competency-based education and competency-based (programmatic) assessment bring us a step closer to optimising the quality of the education of our future doctors.

About the author

Lambert Schuwirth is a Professor of Medical Education and Director of the Prideaux Centre for Research in Health Professions Education at Flinders University in Adelaide, Australia.

“How many of these have you done?”

A recent opinion piece in the Journal of the American Medical Association drew attention to the issue of procedural experience in healthcare.

Titled “How many have you done?”, the piece described the experience of a doctor who required a procedure herself: in this case, an amniocentesis.

Of course, what the patient was really asking was, “How can I be reassured that you know what you are doing?”

The thrust of the piece was that the doctor in training who performed the procedure had felt compelled to misrepresent his experience with the technique, deftly deflecting the questions from the patient and her partner that explored his competence. The author calls for a more honest response to these types of questions, while acknowledging that this is often difficult to do.

But is it any wonder a young doctor has trouble answering this question?

Healthcare continues to battle with the issue of competency. It is still rare for doctors to be formally certified to perform specific procedures. In fact, the industry still does not have a shared understanding of what competency actually is!

Furthermore, because it is uncommon for doctors to assiduously record their activity and outcome data, and rarer still for them to benchmark themselves against their peers, most clinicians are simply oblivious to their own performance level.

So when patients are searching for reassurance that they will be cared for as well as they can be, most of us struggle to give a clear and meaningful response. Because most of the time, we just don’t know.

Wouldn’t it be much better for the junior doctor to answer with authority?

“Well, I’ve completed a recognised pathway and been certified to practise after a period of supervision by experts. Furthermore, I continuously review my performance results and feel comfortable that I’m doing well.”

Osler Community Survey – Results

Thanks to all the respondents to our first Osler Community Survey! The response rate was highly encouraging, and the feedback quite insightful in relation to professional development, clinical governance, credentialing, logbooks and desirable Osler product features. We look forward to putting your input and feedback to good use in the next versions of Osler, and we will keep you apprised of the upcoming launch of Osler Community later this year via our newsletter and here on the blog.

Congratulations to N Kumta (Australia), M Hoops (Australia) and DP Bowles (UK), our three winners of a free 12-month subscription to Osler!

The infographic below provides a summary of your feedback and our findings.

[Infographic: Osler Clinical Performance]

Or, you can download it from the link below:

Osler Survey (June 16) Results