Osler’s novel clinical governance platform recognised again

Osler Technology is delighted to have been recognised by the industry-leading Learning Technologies Awards in London in November 2016.  The LTAs are long established as the peak awards in the online learning and training community, and Osler's clinical governance platform placed second in the Novel Use of Technology – International category.

Learning Technologies Awards has recognised Osler's clinical governance platform

The award reinforces the sound learning principles upon which the platform is based.  Ensuring learners have access to high quality feedback, in a format they can use to reflect on easily, is critical for busy professionals.  The platform also incorporates an Entrustable Professional Attributes model of assessment, designed to better identify learners who need more assistance.

Work-readiness was a key theme in healthcare clinical governance this year, with the release of a stream of reports and papers identifying healthcare error as a major source of patient morbidity.  Osler offers an immediate solution to many common governance issues.

2016 was a big year for Osler, with the completion of our commercialisation phase, the implementation to our first set of customers, successfully applying for grant funding from Federal and State Governments, and recognition in international awards and conferences.  But it’s only the beginning…

We look forward to working with all our supporters and customers in 2017.

Hospital Safety Report Provides Opportunity for Innovation

A new report into managing critical safety risks in the hospital system makes important recommendations and highlights the need to prevent patient injury and death in the first place, says a high-tech medical start-up.


The just-released Duckett review into DHHS management of incidents, such as a spate of perinatal deaths in Victoria in 2013-14, underlines an ongoing national tragedy: 1,800 Australians die each year, and another 6,800 are harmed, as a result of such adverse events.


Osler Technology, the brainchild of intensive care and retrieval medicine specialist Dr Todd Fraser, provides real-time, meaningful data on patient outcomes within the clinical workspace, and enables clinician leaders to respond rapidly and creatively to prevent future events.


By tracking individual training and activity data, it ensures all staff practice within their scope.


“The recommendations of the Duckett review create a better environment for patient safety across the wider health system but this is only half the battle. Left unanswered is how health systems prevent patient injury in the first place,” said Dr Fraser.


“If meaningful change is to be made, clinical staff will be the ones to make it, and they can only do so with the right tools and information at their disposal. This is where a system like Osler plays a role by taking the high level principles identified in the Duckett review and applying them at individual patient, clinical manager and clinician levels.”


Read the full media release here:


Our approach to clinical governance needs rethinking

Stephen Duckett’s report highlights need for new thinking on clinical governance


The recently released Duckett Report was commissioned in response to a review of the role of DHHS in detecting and managing critical safety risks and clinical governance across the system following a cluster of perinatal deaths at Djerriwarrh Health Service in 2013-14.


The report highlights that while Victorians have a right to assume that healthcare is generally of a high quality across the system, there continue to be significant deficiencies in the system’s defences against avoidable patient injury.


The report cites a 2014-15 review of hospital-acquired diagnoses in the Victorian healthcare system, which concluded that “complications of care are far from rare in our hospitals”.  In fact, more than 300,000 patients per year suffer a complication in Victorian hospitals, and at least 70,000 of these, such as malnutrition or pressure ulcers, are potentially preventable. Many result in fatalities.


This issue, to say the least, is huge.


The Duckett report was commissioned to review the role of the DHHS in preventing these events.


Essentially, the report calls for the department to better support Victorian Health Services in providing a high level of local clinical governance on safety and quality, to monitor outcome data more closely, and to respond more effectively when things go wrong.

Major Findings

To this end, the report makes several important recommendations, among them:


  1. The department must set a clear example to the wider health service that this issue is its number one priority
  2. The system should focus less on meeting accreditation standards and place greater emphasis on outcome monitoring
  3. The department should better support the boards of health services by reviewing the appointment and training processes for board members, to ensure they can effectively oversee hospital governance
  4. Health service performance should be monitored more effectively, by making better use of available data and filling gaps where they exist
  5. Data utilisation should be improved, so that the entire system can benefit and learnings are better shared
  6. All hospitals should be open to periodic external review
  7. Hospitals should be held to account for providing only care that falls within their scope of capability
  8. Consumers and front-line clinicians must have a louder voice in the quality assurance process.


What does this all mean?

Promoting a culture of transparency, accountability and, most importantly, trust is an excellent start, and to this end the department is to be congratulated for the example it has set.


As Dr Duckett himself points out, the department acted immediately to help Djerriwarrh protect its patients, investigate the cluster of deaths and engage in an open disclosure process.  It then sought prompt external review of its own role in the process and made the results public immediately.  This is a high-level demonstration of the transparent accountability required at clinician level.


The recommendations of the Duckett review create a better environment for patient safety across the wider health system, but this is only half the battle. Left unanswered is how health systems prevent patient injury in the first place.  Albeit beyond the scope of the review, here lies the bulk of the improvement curve: what are hospitals and health services, their clinical managers and individual clinicians supposed to do to improve patient care?  The report provides few answers.


If meaningful change is to be made, clinical staff will be the ones to make it, and they can only do so with the right tools and information at their disposal.


Without appropriate tools, processes and culture in place, no amount of oversight will achieve the department’s lofty goal of zero preventable patient injury.  This is where ready-made systems like Osler play a role, taking the high-level principles identified in the Duckett review and applying them at the individual patient, clinical manager and clinician levels.

How can Osler contribute to improved clinical governance?

Osler provides an opportunity for hospitals to be proactive in their patient safety efforts.  As Dr Duckett points out, hospitals should be operating within their defined scope of practice.  The problem is that few clinical managers have sufficiently granular visibility of their activity to enable this to occur.  Using Osler to ensure all staff are adequately trained in invasive procedures, non-technical skills and basic equipment familiarity helps manage this clear and apparent risk.


By providing real-time, meaningful and comparative data on clinical proficiency, complication rates and currency of practice, Osler enables hospitals to identify and respond to limitations in service levels and patient care.
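As an illustration of what comparative outcome data can mean in practice, the sketch below flags clinicians whose complication rate for a given procedure sits well above the peer average. This is not Osler’s actual implementation; the logbook fields and the peer-benchmark rule are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class LogbookEntry:
    clinician: str
    procedure: str
    complication: bool  # True if the case had a complication

def complication_rate(entries, clinician, procedure):
    """Complication rate for one clinician on one procedure."""
    cases = [e for e in entries
             if e.clinician == clinician and e.procedure == procedure]
    if not cases:
        return None  # no data: cannot compute a rate
    return sum(e.complication for e in cases) / len(cases)

def flag_outliers(entries, procedure, threshold=1.5):
    """Flag clinicians whose rate exceeds `threshold` times the peer average."""
    clinicians = {e.clinician for e in entries if e.procedure == procedure}
    rates = {c: complication_rate(entries, c, procedure) for c in clinicians}
    peer_avg = sum(rates.values()) / len(rates)
    return [c for c, r in rates.items() if peer_avg > 0 and r > threshold * peer_avg]
```

A real system would of course need case-mix adjustment and minimum case counts before drawing any conclusions; the point here is only that granular logbook data makes such comparisons possible at all.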


And by creating a collaborative environment for clinicians, Osler can distribute these essential learnings across the healthcare system so that Victorians, indeed all Australians, can be treated more safely.



“You have to be willing to acknowledge your problems before you can remedy them. If I were to characterise the state of public and private hospital care in the state of Victoria, I’d have to say that this first step is lacking. Both the public and private hospital systems and the government regulators who oversee them are in a state of denial with regard to the level of harm being caused to the public by inadequate attention to quality and safety deficiencies.”

Paul Levy

former President and CEO of Beth Israel Deaconess Medical Center in Boston, Massachusetts

Deakin University, Thinker in Residence, 2016

Training in Ultrasound – In the kingdom of the blind, the one-eyed man is King

In this special guest blog post, Dr Adrian Wong discusses the challenges surrounding the introduction and implementation of a new technology in healthcare, bedside ultrasonography.



The application of ultrasound beyond the realms of the Radiology department is well and truly established. Ultrasound has evolved into an indispensable tool in the physician’s armamentarium, providing diagnostic, monitoring and procedural guidance within a neat package. Acute physicians, ED doctors, anaesthetists and critical care physicians have all embraced ultrasound as an essential part of their role.


The key to utilising ultrasound successfully, in the hands of such a diverse group of specialties, lies in asking the right questions; hence the development of focused examinations. Focused echocardiography is probably the best example, permitting non-cardiologists to answer questions immediately relevant to their area of practice. The ability to confidently rule out (or in) pericardial tamponade in cardiac arrest, or pneumothorax in trauma, is pivotal in patient management. A word of caution, though: as point-of-care ultrasound (POCUS) examinations are usually performed in a time-sensitive environment, getting it wrong can have significant repercussions. Urban legends, such as a patient being thrombolysed because the LV was mistaken for the RV, or a ‘leaking AAA’ taken straight to theatre only to reveal a normal-calibre aorta, are whispered in corridors as a reminder to the budding POCUSologist.


The proven clinical benefits of point-of-care ultrasonography have led to the ongoing expansion of its role into uncharted areas. Whilst obviously exciting, this raises the issue of training and competency (to perform, interpret and act upon results). How best to become competent in ultrasonography makes for interesting, and sometimes divisive, conversation.


Reflecting on personal experience, my interest in POCUS coincided with the launch of CUSIC (Core Ultrasound Skills in Intensive Care), the UK’s own POCUS programme. A handful of centres in the UK offered fellowships with qualified trainers and suitable training opportunities. Apart from the guidance of experienced colleagues, my training was supplemented with online FOAMed resources. Videos recorded and shared (available free of charge) by esteemed teams of individuals such as @5minsono and @ultrasoundpod were instrumental in my professional development. Since then, the number of courses and fellowships available have continued to expand. I now help run our department’s POCUS fellowship and hence the issue of training is never far from my mind.


When one considers all the possible modules under the umbrella term of point-of-care ultrasound (POCUS), such as echocardiography and abdominal scanning, the concept of training and competency becomes even more nebulous. There are numerous POCUS accreditation programmes available from a variety of bodies. The ACCP, ESICM and ICS (UK) have each developed their own programmes, all of which overlap to some degree. As adult learners have different styles of learning, there is no single best way to learn POCUS.


As an example, the BSE (British Society of Echocardiography) accreditation for critical care requires a theoretical and practical examination with a logbook of 250 appropriate cases. In contrast, FICE (Focused Intensive Care Echocardiography) accreditation requires attendance at a course, a logbook of 50 cases and a triggered assessment. These two accreditations obviously differ in their resulting skillsets and breadth of clinical scanning experience, but this highlights the variation in training requirements for the module of echocardiography in critical care. Furthermore, BSE requires a regular logbook of cases to maintain accreditation, whereas no formal processes are currently in place to maintain FICE accreditation. In practice, any clinician with BSE or FICE accreditation is able to perform day-to-day echocardiography in an intensive care setting (although awareness of one’s own limitations is crucial in more complex cases).


Generally speaking, all the accreditation programmes are divided into theoretical knowledge and practical skills.


The theoretical component generally comprises basic physics, anatomy, and a description of textbook views and pathology. A diagnostic algorithm is also introduced. This component can be delivered online or in person at courses; each has its advantages and drawbacks.


Arguably the more important component of training is the practical aspect. This is where a face-to-face course or apprenticeship provides a useful starting point. Having an expert guide you through the scan, from positioning the patient to adjusting probe orientation, is invaluable.


After the course, the available accreditation systems diverge. The UK’s POCUS accreditation (CUSIC) requires a specified number of scans in the presence of a mentor. The expectation of this programme is that supervised scans are performed until a minimum number has been achieved, thereafter triggering an assessment phase. Such an apprenticeship model is labour-intensive compared to other programmes, which may require only the uploading of scans to an online logbook for review.


A recent survey of ESICM members showed that the main barrier to obtaining POCUS training is a lack of trainers (personal communication). If there are insufficient trainers, the rollout of training and assessment will be limited or delayed. As mentioned above, an online platform (with minimal face-to-face interaction) is certainly one way of tackling a limited trainer base. But does this approach cheat the learner of the invaluable mentorship process? And how do we ensure that the end product is a confident and competent physician who will ultimately benefit patients?


There is a need to improve access to trainers and this inevitably means increasing their numbers. In the UK, the number of ‘training the trainers’ courses has increased but is still rather limited and does not match demand. There is a danger of rushing these trainers through the process without the necessary checks in place. This benefits no one, least of all the patients.


As in the rest of medicine, learning does not stop once accreditation is complete. Without a universally agreed method of maintaining accreditation across the various POCUS programmes, there is a natural concern that once accreditation is gained, physicians will fail to maintain their skillset, for example due to a lack of time or inadequate exposure to clinical variety.


Accepting that publication bias exists, the literature is full of manuscripts demonstrating that learning the skill of ultrasound is not difficult. Their conclusions are often along the lines of “it takes X months for a complete novice to learn and attain a 95% agreement rate with scans performed by experts”. Such feasibility studies often hint at the potential of ultrasound to improve patient outcomes (without being able to confirm this), further adding to the feeding frenzy of colleagues wanting to learn and develop POCUS skills.


Underlying all these training principles and crucial for future development is a matching governance structure. How images are stored, indexed, reported and reviewed all need to be planned before training programmes launch locally.


In summary, there is a variety of accreditation programmes available. They vary in:

  • The modules covered
  • What is actually required in the modules
  • How training is delivered – face-to-face vs distance learning
  • The number of scans/logbook requirements
  • The assessment process
  • The reaccreditation/maintenance of competency process


In conclusion, when learning and performing POCUS, self-awareness is crucial. Being aware of one’s own limitations and indeed, the limitations of the scan being performed is of paramount importance. Putting your hand up and admitting that you need help or more expert opinion is a sign of strength not weakness. With that awareness firmly in place, go out there and learn!




Expert Round Table on Ultrasound in ICU. Intensive Care Med. 2011 Jul;37(7):1077-83. Epub 2011 May 26 – International expert statement on training standards for critical care ultrasonography


United Kingdom’s Accreditation Programme, Syllabus and Logbook (FREE)

POCUS – http://www.ics.ac.uk/ics-homepage/accreditation-modules/cusic-accreditation/

ECHO – http://www.ics.ac.uk/ics-homepage/accreditation-modules/focused-intensive-care-echo-fice/


ESICM European Diploma in Echocardiography – http://www.esicm.org/education/edec

International consensus statement on training standards for advanced critical care echocardiography – http://icmjournal.esicm.org/Journals/abstract.html?doi=10.1007/s00134-014-3228-5


ACCP Critical Care Ultrasonography accreditation – http://www.chestnet.org/Education/Advanced-Clinical-Training/Certificate-of-Completion-Program/Critical-Care-Ultrasonography




There’s a better way of doing business

Nowhere is the benefit of re-engineering training processes better illustrated than in Advanced Life Support (ALS).


ALS is a component of national standards in most countries.  For example, in Australia, NSQHS standard 9.6 requires all acute care hospitals to have a clinical workforce that is able to respond appropriately to a deteriorating patient, including having a system in place to ensure access at all times to clinicians who can practice advanced life support.


Furthermore, it is a clear community expectation that hospital staff are prepared and able to resuscitate the victims of cardiac arrest.


In practice, though, this standard is variably interpreted.  The Australian standards provide little or no guidance on what being “appropriately trained” to provide ALS actually means.


Nonetheless, the default position is to complete an ALS course provided or sanctioned by the Australian Resuscitation Council (ARC).  This two-day course combines pre-reading, lectures and seminars, part-task training, simulations and assessment (usually fact recall and a simple, unrealistic simulation).


While the NSQHS standards make no mention of refreshing skills and knowledge, the ARC allows you to undertake a 1-day refresher course if you do so within 2 years of your last certification, or a repeat 2-day course every 4 years.


In essence, they forget about you for 4 years, and assume that you can execute on day 1460 as you did on day 1.


No account is taken of:

  • local policy
  • your clinical exposure
  • the equipment you use in your own environment

There is a better way.


We all learn best when we build on our past experience and knowledge.  Spaced learning is an evidence-based strategy that allows learners to process the information they have been presented with, and build on it over time.  Cramming all learning into a two-day period flies directly in the face of this approach.
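The spaced approach can be pictured as a simple expanding-interval schedule: revisit material after a day, then a week, then progressively longer gaps. The intervals below are illustrative assumptions, not a published curriculum or Osler’s actual schedule.

```python
from datetime import date, timedelta

# Illustrative review intervals (in days): expanding gaps between
# revisits, rather than cramming everything into one two-day block.
REVIEW_INTERVALS = [1, 7, 30, 90, 180]

def review_schedule(first_session: date) -> list:
    """Return the dates on which a topic should be revisited."""
    schedule = []
    current = first_session
    for gap in REVIEW_INTERVALS:
        current = current + timedelta(days=gap)
        schedule.append(current)
    return schedule
```

A learning platform would typically also adjust these gaps based on how the learner performs at each revisit, shortening them after a poor result and lengthening them after a strong one.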


A far more appropriate way to impart knowledge is to build a curriculum for continuous learning: teaching the basics, reinforcing them after a short period, and building further awareness over time.


Well constructed online learning platforms enable users to tailor their learning to their needs, accelerating through sections they are familiar with, while allowing them to explore areas they want to know more about.


Traditional ALS training does not take local context into account.  Awareness of local policies and procedures, and familiarity with the equipment staff are expected to use in the heat of resuscitation, are critical to a well-prepared service.


Exposure to clinical activity is also essential.  The capability we would expect of someone who has not been involved in a resuscitation event for two years is vastly different from that of someone who does it weekly, yet traditional models do not recognise this.


Healthcare professionals who use the logbook functionality to record their participation in real-world ALS events can monitor their activity and outcomes.  Using this data, Osler helps the user identify when they are lacking exposure to clinical resuscitation.
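One way to picture such a currency check: assume each logbook entry carries a date, and flag a user whose most recent real-world event falls outside a given window. The field names and the 12-month window are assumptions for illustration, not Osler’s actual rule.

```python
from datetime import date

def lacking_exposure(event_dates, today, max_gap_days=365):
    """Return True if no logged ALS event falls within the window."""
    if not event_dates:
        return True  # never logged an event
    most_recent = max(event_dates)
    return (today - most_recent).days > max_gap_days
```

Run periodically across a workforce, a check like this lets managers target refresher training or simulation at exactly the staff whose real-world exposure has lapsed.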


Smart, technology based approaches to ensuring fitness-for-practice can be implemented to fill these gaps.


Simulation is a strategy used to help students learn to apply factual knowledge.  It includes interpretation of data, prioritisation, anticipation and planning and decision making.


Osler has developed a real world simulation online, allowing the learner to experience what it’s really like to run an ALS scenario.


These two approaches to preparing for an ALS event could not be more polarised.  Who would you want looking after you if your heart stops?


Furthermore, by training in this way, real-time completion data is maintained, helping hospitals demonstrate compliance with national standards.


Finally, the lack of portability of training acquired in healthcare environments leads to duplication of training (and enormous expense and waste), while assumptions are made about the capability of staff based on where they have worked in the past.  Osler believes our credentials must be portable, visible and, above all, useful to individuals and managers alike.
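To make the idea of a portable credential concrete, it could be as simple as a structured record that any hospital system can read and check for currency. The fields below are illustrative assumptions, not Osler’s actual data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Credential:
    holder: str    # clinician's name or identifier
    skill: str     # e.g. "Advanced Life Support"
    issuer: str    # certifying body or hospital
    issued: date
    expires: date

    def is_current(self, on: date) -> bool:
        """A portable credential must be checkable anywhere, at any date."""
        return self.issued <= on <= self.expires
```

Because the record travels with the clinician rather than living in one hospital’s training database, a new employer can verify it instead of repeating the training from scratch.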


Lambert Schuwirth: Where are we now with Competency Based Training in Healthcare?

Like so many other disciplines, medical education has its vogues: topics, ideas and approaches that become popular so quickly and massively that some refer to them as ‘hypes’.


I must confess that I too think some developments in medical education are hypes, and I cannot escape the impression that some are solutions in search of a problem. I am not happy with such hypes, because they make it more difficult to defend medical education as a serious and scientific discipline, and they do harm to my discipline.


Sometimes they even lead to very costly mistakes. The epitome of such a mistake was the use of long branched-scenario simulations for high-stakes, summative testing of clinical reasoning in the 1970s and 1980s. What seemed a good and valid approach turned out to be inherently flawed, but that was only discovered after millions of dollars had been spent on projects and numerous candidates had been failed on the basis of faulty assessments. This, by the way, had nothing to do with a lack of computers, virtual reality or imagination, but with a lack of understanding of what clinical reasoning actually is; in other words, a lack of fundamental research. But this is long behind us, and medical education has progressed as a discipline.


So where do we put the development of competency-based education? In the ‘hype’ basket or the ‘genuine step forward’ basket? I would argue that it is a step in the right direction, but also that we are not there yet. Let me explain why.


Traditional Model

A typical traditional model of medical competence describes it as the combination of knowledge, skills, problem-solving ability and attitudes, assuming that each could be separately taught and separately assessed. Each of these four pillars was more or less seen as a personality trait: relatively stable and relatively generic. So, for example, one could have good problem-solving ability and no knowledge, or vice versa. In addition, it was assumed that for each there was a single best assessment method: vivas, for example, would be best for clinical reasoning, OSCEs best for skills, and so on.


Unfortunately, robust research findings do not support this intuitive model. Scores, for example, generalise much better within the same content area, even across formats: one study showed that a knowledge test on skills correlated much better with an actual OSCE station (given the same content domain) than two different OSCE stations within the same examination did with each other. Numerous studies have demonstrated that the validity of an assessment is determined by what is asked, rather than by whether it is an open-ended or multiple-choice question. These and other findings meant that the traditional model of ‘knowledge’, ‘skills’, ‘problem-solving ability’ and ‘attitudes’ was no longer tenable, and a new, more integrative view of competence had to be developed: one based on the finding that content is important and format is not, one that seeks to combine information that is content-similar rather than format-similar.


In addition, medical curricula have often been structured according to the organisation of healthcare. In this structure, a competent graduate is seen as someone who has successfully completed all the individual discipline-based examinations and successfully navigated all the discipline-based clerkships, but without integration between them.


Of course, patients are not naturally ‘structured’ the way healthcare is structured; they may present with undifferentiated complaints, combinations of morbidity, polypharmacy, and so on.  So here, too, there is a need to reorganise our curricula, preferably with the patient as the integrative factor rather than the organisation of secondary and tertiary healthcare.


Competency Based Learning

Competency-based education has been suggested as a way out of these dilemmas. It suggests that the structure of education and the definition of outcomes are better organised around content themes rather than quasi-personality traits, putting patients and their management at the heart of the process. Competencies, although they have been defined in numerous ways, can generally be seen as simple-to-complex tasks that a successful candidate must be able to handle, during which he or she applies, at the right time, the correct and relevant knowledge, skills, attitudes and meta-cognitions to manage the situation successfully.


Are we there yet? No: there is still considerable ground to cover before we have truly competency-based education, and some changes in thinking may need to take place.


The first is a change from a reductionist to a holistic view. Our education and assessment often deconstruct competence into small teachable and assessable pixels, and then struggle to put them back together to see the whole picture. That is why arbitrary decisions have to be made about cut-off scores, about how to combine assessment results, and so on. In a more holistic approach, the whole picture is described first, and the individual pixels can be inspected when needed. For assessment, this means that competence is not a combination of measurements, but a narrative in which measurement information is used. This is not unlike a patient’s condition: a narrative in which lab values can be used.


The second is a change from linear to more complex, dynamic views. Many essential aspects of competence cannot be modelled in a one-size-fits-all way. Feedback is not given in exactly the same way to all students in all situations; it is an adaptive and largely unpredictable process. The same applies to healthcare: our students should not be taught simple tricks. They are not expected to deal with every patient in every situation in exactly the same way, but need to be adaptable and agile enough to cater to every, often unexpected, situation. No doctor can predict exactly what will be said two minutes and 45 seconds into a consultation, but they can be pretty confident that their reaction will be correct and that they will not overstep boundaries. Our education and assessment approaches will increasingly need to reflect this complexity and uncertainty.


The third is to do away with our fear of qualitative information. In assessment specifically, it is often thought that numbers are objective and therefore reliable. Yet even in research, the typical quantitative article still contains many more words than numbers, because the words are needed to make sense of the numbers.  Even the oft-quoted Lord Kelvin, who argued that “numbers or measurements are better than words”, apparently needed words to make that claim (it was a rhetorical argument, not a mathematical derivation). I am not saying that we need to do away with numbers; quite the opposite. But we do need to understand that while the collection of data can happen objectively, any interpretation involves subjective human judgement, and we should not fear it. Is this a plea to be arbitrary and capricious with assessment and education? No: just as objectivity does not imply reliability, subjectivity does not imply unreliability, and much has been written about this.


So in summary, education and assessment need to be of high quality, transparent, meaningful and valid, and the developments towards competency-based education and competency-based (programmatic) assessment are a step closer towards optimising the quality of the education of our future doctors.



About the author

Lambert Schuwirth is Professor of Medical Education and Director of the Prideaux Centre for Research in Health Professions Education at Flinders University in Adelaide, Australia.


Clinical Skills Development

“How many of these have you done?”

A recent opinion piece in the Journal of the American Medical Association drew attention to the issue of procedural experience in healthcare.


Titled “How many have you done?”, the piece described the experience of a doctor who required a procedure herself, in this case, an amniocentesis.


Of course, what the patient was really asking is, “How can I be reassured you know what you are doing?”


The thrust of the piece was that the doctor in training who performed the procedure had felt compelled to misrepresent his experience with the technique, deftly deflecting the questions from the patient and her partner that explored his competence.  The author calls for a more honest response to these types of questions, while acknowledging that this is often difficult to do.


But is it any wonder a young doctor has trouble answering this question?


Healthcare continues to battle with the issue of competency.  It is still rare for doctors to be formally certified to perform specific procedures.  In fact, the industry still does not have a shared understanding of what competency actually is!


Furthermore, because it is uncommon for doctors to assiduously record their activity and outcome data, and even rarer for them to benchmark against their peers, most clinicians are simply oblivious to their own performance level.


So when patients are searching for reassurance that they will be cared for as best they can be, most of us struggle to be clear and meaningful in our response.  Because most of the time, we just don’t know.


Wouldn’t it be much better for the junior doctor to answer with authority?


“Well, I’ve completed a recognized pathway and been certified to practice after a period of supervision by experts.  Furthermore, I continuously review my performance results and feel comfortable that I’m doing well.”

Osler Community Survey – Results

Thanks to all the respondents to our first Osler Community Survey!  The response rate was highly encouraging, and the feedback insightful in relation to professional development, clinical governance, credentialing, logbooks and desirable Osler product features.  We look forward to putting this input to good use in the next versions of Osler, and we will keep you apprised of the upcoming launch of Osler Community later this year via our newsletter and here on the blog.


Congratulations to N Kumta (Australia), M Hoops (Australia) and DP Bowles (UK), our three winners of a free 12-month subscription to Osler!


The infographic below provides a summary of your feedback and our findings.

Osler Clinical Performance


Or, you can download it from the link below:

Osler Survey (June 16) Results

Our approach to clinical governance needs rethinking

Enough is enough

The June 2016 edition of Clinical Communiqué (a periodic report released by the Victorian Institute of Forensic Medicine) once again highlights the issues facing procedural healthcare.  The report examines three recent coronial inquests into patients who succumbed to complications from central access devices, including a fatal myonecrosis, a pericardial tamponade and a carotid placement resulting in a stroke.  Multiple issues are identified in the insertion and subsequent management of these devices.


My problem with this is that we’ve heard it all before.  There is nothing new in these recommendations, yet the incidents keep happening.  And it’s far too simplistic to think of the clinical staff involved as “bad apples”.  The simple fact is, they are not.  They are hard-working, intelligent, dedicated, diligent and well-intentioned.  In fact, they’ll no doubt be completely traumatised by the experience.  So why does this keep happening?


Off the top of my head, let’s start with the following:
a) a lack of agreement on best approach (but there ARE existing guidelines)
b) failure to communicate guidelines effectively to those at the coalface
c) resistance by clinicians to embrace best available evidence
d) total lack of structured accreditation process for insertion of lines (and most other invasive procedures)
e) systemic failure to share learnings just like this on a wide enough scale


Surely it’s time our industry got its act together and did something meaningful to overcome these barriers.


If you’re interested, here’s the report.


About the author

Dr Todd Fraser is an intensivist and retrieval physician, and the co-founder of the Osler Clinical Performance Platform, dedicated to improving certification and training in acute healthcare.

Development update

The recent expansion of our Melbourne-based development team has allowed for significant progress in the past 6 months, culminating in the release of version 2.0 of our platform this month.

Version 2.0 includes important enhancements to key functionality such as procedure logging and the My Training section.

It also sees the release of our mobile-enabled Assessments platform, enabling workplace-based evaluation of clinical skills, procedural competence and equipment certification.  Structured assessments ensure that all staff are provided with objective and consistent feedback on their performance, improving skills and knowledge acquisition.


The Assessment framework supports many of our purpose-built training plans, such as:

  • Basic and Advanced Life Support skills
  • Basic procedures
  • Certification in key equipment
  • Falls risk assessment and prevention

Assessments can also be custom built to suit local needs.

Combined with our mobile procedure supervision and evaluation tools, Osler Clinical Performance enables vastly improved clinical governance and provides a real-time solution to your credentialing and compliance requirements.

Osler Clinical Performance Version 2 is available for providers and institutions now.  If you’d like to trial Osler for 30 days, you can set up your own demo version on the AppExchange, or contact us at info@oslertechnology.com.
