Osler receives Ignite Ideas grant for Patient Feedback platform

Osler Technology has recently been granted funding to complete its Patient Feedback portal.


The Advance Queensland Ignite Ideas program awarded Osler Technology $100,000 to add this important metric to its ever-expanding suite of Professional Development Tools.


“Patient feedback is critical to the ongoing professional development of healthcare staff, and we’re delighted that the Queensland Government has supported the commercialisation of this feature,” says Osler co-founder, Dr Todd Fraser.


“Not only does patient feedback align well with other sources such as peer review, but it also provides otherwise invisible information on the experience of our patients. There is good evidence that structured feedback from patients has a significant impact on individual clinicians.”


The system will give clinicians a means of collecting structured, secure feedback from their patients, designed solely for the purpose of personal improvement. The collated data becomes part of the clinician’s performance portfolio and development plan.


In addition to its invaluable impact on self-management of performance, Dr Fraser says the feature can contribute to registration requirements.


“Many colleges and boards recognise patient surveys as suitable for Continuing Professional Development programs, so this is a very easy and effective way for doctors to comply with their registration requirements.


“It beautifully complements other CPD activities that our users can undertake, including self auditing of procedures, peer review assessments, interactive simulations and online learning modules.”


The new Patient Feedback Portal is due for completion in early 2018.

Here’s how Osler is helping junior doctor training

“The (current) assessment process is largely focussed on identifying the very few instances of serious underperformance, and provides little meaningful feedback for the majority.” 

Review of Medical Intern Training

Australian taxpayers make a considerable investment in the future of the healthcare system. Nationally, over $300M is invested per year to train interns (Australian Healthcare Ministers’ Advisory Committee, 2015), yet there is no qualitative or quantitative evidence that interns currently meet the training objectives set for them.


In 2006, the Australian Curriculum Framework for Junior Doctors (Confederation of Postgraduate Medical Education Councils, 2012) was developed to provide structure to the learning and development of the next generation of clinicians.  The ACFJD focused on the core knowledge, behaviours and practical skills in which a doctor should have achieved competency by the end of their second post-graduate year of practice. However, the ACFJD was released without mechanisms to deliver training or evaluate outcomes, and it has come to be widely regarded as failing to meet its objectives.


In April 2014, the COAG Australian Health Ministers Advisory Council (AHMAC) commissioned the Review of Medical Intern Training (Council of Australian Governments, 2015) in response to increasing medical graduate numbers, and concern regarding the system’s capacity to train them within existing constraints.


The review found that despite the well-recognised variability in the skills of medical graduates, the “consumers” of Australia’s medical school product (the public health services) are unable to validate the work readiness of their new employees [National Intern Work Readiness Forum (Australian Healthcare Ministers’ Advisory Committee, 2016)].  Consequently, employers set unreasonably low expectations of their staff, resulting in underutilisation of the workforce, reduced productivity, disenchantment of interns and inefficient training.


Conversely, many interns continue to feel under-prepared at graduation.  National surveys indicate that up to 44% of interns feel ill-prepared to perform basic procedures when they enter the workforce, leaving them prone to performing tasks poorly on patients, leading to error, injury, cost and distress for patient and doctor alike.


Universities meanwhile continue to struggle to measure the quality of their graduates once in the workforce, significantly impairing their ability to calibrate the quality of their undergraduate programs.


The AHMAC report, and the National Intern Work Readiness Forum that followed it, have recommended fundamental changes be implemented, including:

  • Defining the measurable competencies of a work-ready graduate
  • Focusing on maximizing compliance with these competencies
  • Development of robust assessment frameworks, based on the Entrustable Professional Activities model, to assess work readiness
  • Improving flow of information between university programs, regulators and employers
  • Facilitating a philosophy of individual accountability for learning and development
  • Supporting the expansion of training into non-traditional environments such as private practice.

The report called for research and development to support the change process, including delivery and assessment vehicles for the new framework.

Implementing the Osler platform would immediately address many of these recommendations.


Box One: Abridged recommendations from the Review of Medical Intern Training

Recommendation one: Internship should be changed to (abridged):

  • Require demonstration of specific capabilities and performance, within the time based model
  • Ensure robust assessment of capabilities and feedback on performance
  • Ensure doctors in training have sufficient responsibility, under supervision, to develop competence and confidence while maintaining patient safety
  • Enable and require a philosophy of individual accountability for learning


Recommendation two:

  • That internship should have entry requirements that reflect agreed and defined expectations of work-readiness that graduates must meet before commencing.


Recommendation three:

  • Evaluation of different models of capability assessment, including [impacts on] resource requirements
  • Evaluation of options for an e-portfolio to provide greater individual accountability for learning and support for the assessment process
  • Examination of the capacity to assess and certify the capabilities and performance required for general registration within university programs

Recommendation seven:

  • Provision of dedicated, time-limited support for local innovation initiatives that have the potential to create sustainable improvements in the training experience



Junior doctors trial app to enhance education

The Osler Clinical Performance Platform is now being trialled with junior doctors at Mackay Base Hospital, aimed at further enhancing their educational experience.


Osler is an online portal that provides a consistent framework for recording, analysing, monitoring and benchmarking the performance of clinical staff.


Executive Director of Research and Innovation, Associate Professor David Farlow, said Mackay Base Hospital is the only Queensland Health facility to trial the platform with junior doctors.


“We recently launched the trial of Osler with our Emergency Department and Medicine Interns, PHOs and Registrars,” Dr Farlow said.


“Through the online portal we will keep track of their skills, procedures and outcomes while gaining input on what the program looks like and how it could be implemented.


“Investigating programs like Osler is all part of our bigger move towards online and self-directed learning to help cater to the educational needs of our junior doctors.”


Doctors train continually to remain proficient in the procedures they perform, and technologies such as Osler support the changing environments of healthcare and education.


“Working with our other education channels, Osler will provide the tools for them to confidently perform procedures while ensuring the continued safety of our patients,” he said.


“Our patients can have confidence in knowing the doctor treating them has been trained, credentialed and monitored in the procedure they are performing.”


Co-founder of Osler, Intensive Care specialist Dr Todd Fraser, says the app helps overcome many of the challenges for doctors learning new procedures.


“The healthcare environment is a challenging one to teach in,” Dr Fraser said.


“Medicine is a non-stop business; there are 24-hour rosters to deal with, and many procedures can only be learned when opportunities arise.


“Add to that the fact that many junior doctors frequently migrate between hospitals, and it becomes difficult to track what their competencies are.


“By providing a digital, standardised portfolio of their skills training, Osler helps to keep them, and their patients, safe.”


Bond students set to benefit from Osler platform

Health sciences students, including medical, physiotherapy and allied health disciplines, are set to become the next cohort of Osler platform users.


Osler Compliance helps the Faculty manage student compliance with essential tasks such as immunisation status, registration and first aid certification, prior to the commencement of clinical rotations.


Faculty Business Director Ms Rhonda Morton said the rollout was the first of several Osler components, including a placement management app, and in-workplace assessments that will facilitate more streamlined supervision and feedback for their students.

Ramsay’s Noosa Hospital receives award for implementing Osler

Noosa Hospital’s Education department has been awarded a Ramsay Healthcare national innovation in education award for its implementation of the Osler platform across its clinical staff.


The annual award recognises the hospital as one of the earliest adopters of Osler’s pioneering platform, which it is using to add consistency to workplace assessment of staff and maximise patient safety.  As a result, staff are more aware of their scope of practice, according to Director of Clinical Services Judy Beazley.


The rollout of Osler was recently launched at a media event attended by Queensland Minister for Innovation, Information Technology and Science, Ms Leeanne Enoch.

Noosa Hospital is taking part in a world-first trial using a new app to help improve the skills of doctors and nurses.

Posted by 7 News Sunshine Coast on Wednesday, April 12, 2017

Guest blog post: Show me the evidence, by Dr Rob Hackett

This week, the blog features a guest post by Dr Rob Hackett


If you went skydiving, would you first ask for scientific evidence from a randomized trial that a properly functioning parachute prevents injury before you’d consider using one during your freefall? Probably not.


In fact, no such study exists. Of course, some people without a parachute have survived a freefall from extraordinary heights without injury, and others have sustained injuries even when using a parachute. But it’s clear that you’d use a parachute when skydiving, even without a single randomized trial to prove its effectiveness. Yet, when it comes to medicine, clinicians may be reluctant to employ any intervention without rigorous scientific evidence for its efficacy.


The need for sound evidence evolved from a history of medicine that’s littered with practices that were later abandoned after scientific scrutiny showed that they were ineffective, perhaps even harmful.


As such, we are among the many who would agree with the general concept of “evidence-based medicine.” However, when it comes to patient safety, there are significant obstacles to this approach (see here).


Because Evidence Based Medicine (EBM) places all of the emphasis on clinical trials, it forgets to ask the first, most basic question of all: does the idea make sense? Because of this, EBM can be used inappropriately as a tool to maintain the status quo. Evidence Based Medicine was never designed to assess the plausibility of basic science, e.g. the effects of gravity on someone jumping out of a plane.


Remember, science helps us understand how things are. A more beneficial way to consider evidence is through the framework of Science Based Medicine (see here). Science Based Medicine’s philosophy is that we ought to consider prior probability and plausibility from basic science when determining whether a claim is real enough to study. If the claim passes this test, it should then be studied with proper, well-executed clinical research. However, in delaying the adoption of obviously better interventions while waiting for clinical research, we may leave patients at unnecessary risk (see here).


Consider too: is it really the evidence from multicentre randomised controlled trials (RCTs) that ultimately leads us to adopt a better practice?


Take the example of ultrasound use when inserting central lines – many doctors had honed their skills over years, inserting central lines without the need for ultrasound. Some were then reluctant to use ultrasound when it became available – they hadn’t needed it before so why should they use it now?


What was the level of evidence that led to this change in practice? Was it a multicentre RCT, or was it a collection of other influential factors:

– The basic scientific notion that it’s better to be able to see what one is doing
– A series of heuristic stories of patients succumbing to complications which may otherwise have been avoided*
– Policies introduced obligating the use of ultrasound when it is available (see here)
– A gradual change in culture, and an adoption by sufficient numbers such that those not using ultrasound perhaps begin to feel estranged
– Retirement of those unwilling to change, those who stoically prevent change no matter what, those whom Dr John Hinds may have labelled ‘Resus Wankers’.


We do not live or practise in a laboratory, nor within the boundaries of double-blind, placebo-controlled trials. We live in a real world with patients who are also people. Intuition, clinical experience, and pathophysiologic rationale are indeed important tools, along with evidence-based literature, with which to discern the best care for our patients. To honour such a breadth of perspective, however, requires us to loosen our tenacious grip on the currently accepted doctrines of EBM as the definitive measure of good clinical practice. For in the end, it is really our common sense, nurtured by education, experience, intuition, and rationale, that is always our ultimate measure of evidence, in medicine as in life itself. (see here)


When better interventions (in accordance with basic science) become available, we should perhaps be more ready to adopt them.


Certainly, a lack of evidence must not be used inappropriately to stoically defend the status quo, for in the end it is our patients who suffer.



*It is interesting to note that one of the main presenters of NAP4 data routinely uses a video laryngoscope for intubation. Perhaps he has been influenced by an understanding of Science Based Medicine and exposure to numerous stories of patient demise from difficult tracheal intubation.


About the Author

Dr Rob Hackett is a senior anaesthetist from Sydney, Australia.  He is a passionate campaigner for patient safety, and the author of the Patient Safe blog.

Case Study – how a junior doctor is using Osler to set themselves apart

“Osler has helped me better understand what I’ve done, and where I need to focus my attention going forward.”

Dr Abby McArthur sits back and looks at her activity charts on Osler with a look of contemplation.  “I’ve completed my certification in internal jugular central lines, but it would be nice to do a few more subclavians.”

Abby, a first-year trainee in Emergency Medicine and Intensive Care, has been using the Osler platform to track the procedures she performs, her success rates, and her complications.  She has now followed her progress through three separate hospitals.

“I was having some trouble early on with my PICCs.  I was struggling to get them in at first, but I changed a few things and my success rate is much better now.”

This improved visibility of Abby’s outcome data has helped her in other ways too.

“I’ll be applying for new training positions for the new year, and I’m going to use the data I collect on Osler to demonstrate the certifications I’ve completed, and my activity data, to help build my resume.”

In fact, Osler has already helped Abby get noticed.

“I was told the CEO of one hospital I worked in complimented me on how much I’d achieved there during my rotation.  Apparently that doesn’t happen very often!”

“Having access to Osler at the bedside is very important.  It means that I can easily record all my procedures right after I do them.

“Also, because I’m learning new procedures at this stage of my career, my supervisors can use Osler to give me instant feedback that I can use to improve my technique for next time.”

Osler’s groundbreaking cloud-based assessment framework means Abby can capture and access her data wherever she goes, making it as easy as possible to monitor her progress.

Osler Community goes live!

Individual doctors, nurses and paramedics at all stages of their careers are now able to take advantage of Osler’s unique performance development tools, with the release of Osler Community.


Released this week, Osler Community allows users to learn new skills, invite assessors to review their skills and provide feedback, collect certifications and record their procedural activity.  Powerful analytical tools built into the Osler application can help them understand their own performance like never before.


And all this is built on a global collaborative platform that helps you connect with your peers around the world.


For more information on the Community Edition, head over to http://osler.community

Hospital Safety Report Provides Opportunity for Innovation

A new report into managing critical safety risks in the hospital system makes important recommendations and highlights the need to prevent patient injury and death in the first place, says a high-tech medical start-up.


The just-released Duckett review into DHHS management of incidents, such as a spate of perinatal deaths in Victoria in 2013–14, underlines an ongoing national tragedy: 1,800 Australians die each year, and another 6,800 are affected, as a result of such adverse events.


Osler Technology, the brainchild of intensive care and retrieval medicine specialist Dr Todd Fraser, provides real-time, meaningful data on patient outcomes within the clinical workspace, and enables clinician leaders to respond rapidly and creatively to prevent future events.


By tracking individual training and activity data, it ensures all staff practise within their scope.


“The recommendations of the Duckett review create a better environment for patient safety across the wider health system but this is only half the battle. Left unanswered is how health systems prevent patient injury in the first place,” said Dr Fraser.


“If meaningful change is to be made, clinical staff will be the ones to make it, and they can only do so with the right tools and information at their disposal. This is where a system like Osler plays a role by taking the high level principles identified in the Duckett review and applying them at individual patient, clinical manager and clinician levels.”




Training in Ultrasound – In the kingdom of the blind, the one-eyed man is King

In this special guest blog post, Dr Adrian Wong discusses the challenges surrounding the introduction and implementation of a new technology in healthcare, bedside ultrasonography.



The application of ultrasound beyond the realms of the Radiology department is well and truly established. Ultrasound has evolved into an indispensable tool in the physician’s armamentarium – providing diagnostic, monitoring and procedural guidance within a neat package. Acute physicians, ED doctors, anaesthetists and critical care physicians have all embraced ultrasound as an essential part of their role.


The key to utilizing ultrasound successfully, in the hands of such a diverse group of specialties, lies in asking the right questions. Hence the development of focused examinations. Focused echocardiography is probably the best example of the use of ultrasound, permitting non-cardiologists to answer questions immediately relevant to their area of practice. The ability to confidently rule out (or in) pericardial tamponade in cardiac arrest or pneumothorax in trauma is pivotal in patient management. A word of caution though: as point-of-care ultrasound (POCUS) examinations are usually performed in a time-sensitive environment, getting it wrong can have significant repercussions. Urban legends, such as a patient being thrombolysed because the LV was mistaken for the RV, or a ‘leaking AAA’ taken straight to theatre only to reveal a normal-calibre aorta, are whispered in corridors as a reminder to the budding POCUSologist.


The proven clinical benefits of point-of-care ultrasonography have led to ongoing expansion of its role into uncharted areas. Whilst obviously exciting, this raises the issue of training and competency (to perform, interpret and act upon results). How best to become competent in ultrasonography makes for interesting and sometimes divisive conversation.


Reflecting on personal experience, my interest in POCUS coincided with the launch of CUSIC (Core Ultrasound Skills in Intensive Care), the UK’s own POCUS programme. A handful of centres in the UK offered fellowships with qualified trainers and suitable training opportunities. Apart from the guidance of experienced colleagues, my training was supplemented with online FOAMed resources. Videos recorded and shared (available free of charge) by esteemed teams such as @5minsono and @ultrasoundpod were instrumental in my professional development. Since then, the number of courses and fellowships available has continued to expand. I now help run our department’s POCUS fellowship, and hence the issue of training is never far from my mind.


When one considers all the possible modules under the umbrella term of point-of-care ultrasound (POCUS) e.g. echocardiography, abdominal, etc. the concept of training and competency becomes even more nebulous. There are numerous POCUS accreditation programmes available from a variety of bodies. The ACCP, ESICM and ICS (UK) have developed their own programmes, all of which contain some overlapping similarities. As adult learners have different styles of learning, there is no single best way to learn the skill of POCUS.


As an example, the BSE (British Society of Echocardiography) accreditation for critical care requires a theoretical and practical examination with a logbook of 250 appropriate cases. In contrast, FICE (Focused Intensive Care Echocardiography) accreditation requires attendance at a course, a logbook of 50 cases and a triggered assessment. These two accreditations obviously differ in their resulting skillsets and breadth of clinical scanning experience, but this highlights the variation in training requirements for the module of echocardiography in critical care. Furthermore, BSE requires a regular logbook of cases to maintain accreditation, whereas no formal processes are currently in place to maintain FICE accreditation. In practice, any clinician with BSE or FICE accreditation is able to perform day-to-day echocardiography in an intensive care setting (although awareness of one’s own limitations is crucial in more complex cases).


Generally speaking, all the accreditation programmes are divided into theoretical knowledge and practical skills.


The theoretical component generally comprises basic physics, anatomy, and a description of textbook views and pathology. A diagnostic algorithm is also introduced. This component can be delivered online or in person at courses. Each mode has its advantages and drawbacks.


Arguably the more important component of training is the practical aspect. This is where a face-to-face course or apprenticeship provides a useful starting point. Having an expert guide you through the scan – how to position the patient, how to adjust probe orientation, and so on – is invaluable.


After the course, the available accreditation systems diverge. The UK’s POCUS accreditation (CUSIC) requires a specified number of scans in the presence of a mentor. The expectation of this programme is that supervised scans are performed until a minimum number has been achieved, thereafter triggering an assessment phase. Such an apprenticeship model is labour-intensive compared to other programmes, which may require only the uploading of scans to an online logbook for review.


A recent survey of ESICM members showed that the main barrier to attaining POCUS training is the lack of trainers (personal communication). If there are insufficient trainers, the rollout of training and assessment will be limited or delayed. As mentioned above, an online platform (with minimal face-to-face interaction) is certainly one way of tackling the problem of a limited trainer base. But does this approach cheat the learner of the invaluable mentorship process? How do we ensure that the end product is a confident and competent physician who will ultimately benefit patients?


There is a need to improve access to trainers and this inevitably means increasing their numbers. In the UK, the number of ‘training the trainers’ courses has increased but is still rather limited and does not match demand. There is a danger of rushing these trainers through the process without the necessary checks in place. This benefits no one, least of all the patients.


As in the rest of medicine, the learning process does not stop once the accreditation process is complete. Without a universally agreed method of maintaining accreditation across the various POCUS programmes, there is a natural concern that, once accreditation is gained, physicians will fail to maintain their skillset, for example due to a lack of time or inadequate exposure to clinical variety.


Accepting that publication bias exists, the literature is full of manuscripts demonstrating that learning the skill of ultrasound is not difficult. Their conclusions are often along the lines of “it takes X months for a complete novice to learn and attain a 95% agreement rate with scans performed by experts”. Such feasibility studies often hint at the potential of ultrasound to improve patient outcomes (without being able to confirm this), further adding to the feeding frenzy of colleagues wanting to learn and develop POCUS skills.


Underlying all these training principles and crucial for future development is a matching governance structure. How images are stored, indexed, reported and reviewed all need to be planned before training programmes launch locally.


In summary, there is a variety of accreditation programmes available. They vary in:

  • The modules covered
  • What is actually required in the modules
  • How training is delivered – face-to-face vs distance learning
  • The number of scans/logbook requirements
  • The assessment process
  • The reaccreditation/maintenance of competency process


In conclusion, when learning and performing POCUS, self-awareness is crucial. Being aware of one’s own limitations and indeed, the limitations of the scan being performed is of paramount importance. Putting your hand up and admitting that you need help or more expert opinion is a sign of strength not weakness. With that awareness firmly in place, go out there and learn!




Expert Round Table on Ultrasound in ICU. Intensive Care Med. 2011 Jul;37(7):1077-83. Epub 2011 May 26 – International expert statement on training standards for critical care ultrasonography


United Kingdom’s Accreditation Programme, Syllabus and Logbook (FREE)

POCUS – http://www.ics.ac.uk/ics-homepage/accreditation-modules/cusic-accreditation/

ECHO – http://www.ics.ac.uk/ics-homepage/accreditation-modules/focused-intensive-care-echo-fice/


ESICM European Diploma in Echocardiography – http://www.esicm.org/education/edec

International consensus statement on training standards for advanced critical care echocardiography – http://icmjournal.esicm.org/Journals/abstract.html?doi=10.1007/s00134-014-3228-5


ACCP Critical Care Ultrasonography accreditation – http://www.chestnet.org/Education/Advanced-Clinical-Training/Certificate-of-Completion-Program/Critical-Care-Ultrasonography