Osler receives Ignite Ideas grant for Patient Feedback platform

Osler Technology has recently been granted funding to complete its Patient Feedback portal.


The Advance Queensland Ignite Ideas program awarded Osler Technology $100,000 to add this important metric to its ever-expanding suite of Professional Development Tools.


“Patient feedback is critical to the ongoing professional development of healthcare staff, and we’re delighted that the Queensland Government has supported the commercialisation of this feature,” says Osler co-founder, Dr Todd Fraser.


“Not only does patient feedback align well with other sources such as peer review, but it provides otherwise invisible information on the experience of our patients.  There is good evidence that structured feedback from patients has a significant impact on individual clinicians.”


The system will give patients a secure, structured way to provide feedback to their clinicians, designed solely for the purpose of personal improvement.  The collated data becomes part of the clinician’s performance portfolio and development plan.


In addition to its invaluable impact on self-management of performance, Dr Fraser says the feature can contribute to registration requirements.


“Many colleges and boards recognise patient surveys as suitable for Continuing Professional Development programs, so this is a very easy and effective way for doctors to comply with their registration requirements.


“It beautifully complements other CPD activities that our users can undertake, including self auditing of procedures, peer review assessments, interactive simulations and online learning modules.”


The new Patient Feedback Portal is due for completion in early 2018.

Here’s how Osler is helping junior doctor training

“The (current) assessment process is largely focussed on identifying the very few instances of serious underperformance, and provides little meaningful feedback for the majority.” 

Review of Medical Intern Training

The Australian taxpayer makes a considerable investment in the future of the healthcare system.  Nationally, over $300M is invested per year to train interns (Australian Healthcare Ministers’ Advisory Committee, 2015), yet there is no qualitative or quantitative evidence that interns currently meet the training objectives set for them.


In 2006 the Australian Curriculum Framework for Junior Doctors (Confederation of Postgraduate Medical Education Councils, 2012) was developed to provide structure to the learning and development of the next generation of clinicians.  The ACFJD focused on the core knowledge, behaviours and practical skills in which a doctor should have achieved competency by the end of their second post-graduate year of practice. However, the ACFJD was released without mechanisms to deliver training or evaluate outcomes, and has been widely regarded as failing to meet its objectives.


In April 2014, the COAG Australian Health Ministers Advisory Council (AHMAC) commissioned the Review of Medical Intern Training (Council of Australian Governments, 2015) in response to increasing medical graduate numbers, and concern regarding the system’s capacity to train them within existing constraints.


The review found that despite the well-recognised variability in the skills of medical graduates, the “consumers” of Australia’s medical school product (the public health services) are unable to validate the work readiness of their new employees [National Intern Work Readiness Forum (Australian Healthcare Ministers’ Advisory Committee, 2016)].  Consequently, employers set unreasonably low expectations of their staff, resulting in underutilisation of the workforce, reduced productivity, disenchantment of interns and inefficient training.


Conversely, many interns continue to feel under-prepared at graduation.  National surveys indicate that up to 44% of interns feel ill-prepared to perform basic procedures when they enter the workforce, leaving them prone to performing tasks poorly, leading to error, injury, cost and distress for patient and doctor alike.


Universities meanwhile continue to struggle to measure the quality of their graduates once in the workforce, significantly impairing their ability to calibrate the quality of their undergraduate programs.


The AHMAC report, and the National Intern Work Readiness Forum that followed it, recommended that fundamental changes be implemented, including:

  • Defining the measurable competencies of a work-ready graduate
  • Focusing on maximising compliance with these competencies
  • Development of robust assessment frameworks, based on the Entrustable Professional Activities model, to assess work readiness
  • Improving flow of information between university programs, regulators and employers
  • Facilitating a philosophy of individual accountability for learning and development
  • Supporting the expansion of training into non-traditional environments such as private practice.

The report called for research and development to support the change process, including delivery and assessment vehicles for the new framework.

Implementing the Osler platform would immediately address many of these recommendations.


Box One : Abridged recommendations from the review of Medical Intern Training

Recommendation one : Internship should be changed to (abridged) :

  • Require demonstration of specific capabilities and performance, within the time based model
  • Ensure robust assessment of capabilities and feedback on performance
  • Ensure doctors in training have sufficient responsibility, under supervision, to develop competence and confidence while maintaining patient safety
  • Enable and require a philosophy of individual accountability for learning


Recommendation two :

  • That internship should have entry requirements that reflect agreed and defined expectations of work-readiness that graduates must meet before commencing.


Recommendation three :

  • Evaluation of different models of capability assessment, including [impacts on] resource requirements
  • Evaluation of options for an e-portfolio to provide greater individual accountability for learning and support for the assessment process
  • Examination of the capacity to assess and certify the capabilities and performance required for general registration within university programs

Recommendation seven :

  • Provision of dedicated, time-limited support for local innovation initiatives that have the potential to create sustainable improvements in the training experience



Junior doctors trial app to enhance education

The Osler Clinical Performance Platform is now being trialled with junior doctors at Mackay Base Hospital, aimed at further enhancing their educational experience.


Osler is an online portal that provides a consistent framework for recording, analysing, monitoring and benchmarking the performance of clinical staff.


Executive Director of Research and Innovation, Associate Professor David Farlow, said Mackay Base Hospital is the only Queensland Health facility to trial the platform with junior doctors.


“We recently launched the trial of Osler with our Emergency Department and Medicine Interns, PHOs and Registrars,” Dr Farlow said.


“Through the online portal we will keep track of their skills, procedures and outcomes while gaining input on what the program looks like and how it could be implemented.


“Investigating programs like Osler is all part of our bigger move towards online and self-directed learning to help cater to the educational needs of our junior doctors.”


Doctors train continually to remain proficient in the procedures they perform, and technologies such as Osler support the changing environments of healthcare and education.


“Working with our other education channels, Osler will provide the tools for them to confidently perform procedures while ensuring the continued safety of our patients,” he said.


“Our patients can have confidence in knowing the doctor treating them has been trained, credentialed and monitored in the procedure they are performing.”


Co-founder of Osler, Intensive Care specialist Dr Todd Fraser, says the app helps overcome many of the challenges for doctors learning new procedures.


“The healthcare environment is a challenging one to teach in,” Dr Fraser said.


“Medicine is a non-stop business, there are 24 hour rosters to deal with, and many procedures can only be learned when opportunities arise.


“Add to that the fact that many junior doctors frequently migrate between hospitals, and it becomes difficult to track what their competencies are.


“By providing a digital, standardised portfolio of their skills training, Osler helps to keep them, and their patients, safe.”


Bond students set to benefit from Osler platform

Health sciences students, including medical, physiotherapy and allied health disciplines, are set to become the next cohort of Osler platform users.


Osler Compliance helps the Faculty manage student compliance with essential tasks such as immunisation status, registration and first aid certification, prior to the commencement of clinical rotations.


Faculty Business Director Ms Rhonda Morton said the rollout was the first of several Osler components, including a placement management app, and in-workplace assessments that will facilitate more streamlined supervision and feedback for their students.

Journal review modules – a new Osler feature

Staying abreast of the latest literature is traditionally very difficult for most practising clinicians.


There is a seemingly endless array of journals and papers, only some of which are relevant.  For hardworking clinical staff delivering day-to-day patient care, it’s almost impossible to separate signal from noise.





Busy clinical staff tell us they just don’t have time to review all the relevant literature – what they need is a brief review that highlights the key points in an engaging, interactive and tailored format.


Osler is here to help.  Our recently launched Journal Module series reviews the most important literature in your specialty.  Designed for clinicians with limited time, each review gives you the background, a brief overview of the trial, and how it applies in your daily practice.  Break-out slides enable you to explore more detail if you wish.


And best of all, all the activity you perform can be captured so you can easily update your CPD program with your efforts.


So why not join Osler.community today and take advantage of the Osler platform?

Ramsay’s Noosa Hospital receives award for implementing Osler

Noosa Hospital’s Education department has been awarded a Ramsay Healthcare national innovation in education award for its implementation of the Osler platform across its clinical staff.


The annual award recognises one of the earliest adopters of Osler’s pioneering platform, which is using it to add consistency to workplace assessment of staff and maximise patient safety.  In doing so, staff are more aware of their scope of practice, according to Director of Clinical Services, Judy Beazley.


The rollout of Osler was recently launched at a media event attended by Queensland Minister for Innovation, Information Technology and Science, Ms Leeanne Enoch.

Noosa Hospital is taking part in a world first trial using a new app to help improve the skills of doctors and nurses.

Posted by 7 News Sunshine Coast on Wednesday, April 12, 2017

Guest blog post : Show me the evidence, by Dr Rob Hackett

This week, the blog features a guest post by Dr Rob Hackett


If you went skydiving, would you first ask for scientific evidence from a randomised trial that a properly functioning parachute prevents injury before you’d consider using one during your freefall? Probably not.


In fact, no such study exists. Of course, some people without a parachute have survived a freefall from extraordinary heights without injury, and others have sustained injuries even when using a parachute. But it’s clear that you’d use a parachute when skydiving, even without a single randomised trial to prove its effectiveness. Yet, when it comes to medicine, clinicians may be reluctant to employ any intervention without rigorous scientific evidence for its efficacy.


The need for sound evidence evolved from a history of medicine that’s littered with practices that were later abandoned after scientific scrutiny showed that they were ineffective, perhaps even harmful.


As such, we are among the many who would agree with the general concept of “evidence-based medicine.” However, when it comes to patient safety, there are significant obstacles to this approach. (see here)


Because Evidence Based Medicine (EBM) places all of the emphasis on clinical trials, it forgets to ask the first, most basic question of all: does the idea make sense? Through this, EBM can be used inappropriately as a tool to maintain the status quo. Evidence Based Medicine was never designed to assess the plausibility of basic science – for example, the effects of gravity on someone jumping out of a plane.


Remember, science helps us understand how things are. A more beneficial way to consider evidence is through the framework of Science Based Medicine (see here). Science Based Medicine’s philosophy is that we ought to consider prior probability and plausibility from basic science when determining if a claim is real enough to study. If the claim passes this test, then it should be studied with proper, well-executed clinical research. However, in delaying the adoption of obviously better interventions while waiting for clinical research, we may leave patients at unnecessary risk (see here).


Consider too: is it really the evidence from multicentre randomised controlled trials (RCTs) which ultimately leads us to adopt a better practice?


Take the example of ultrasound use when inserting central lines – many doctors had honed their skills over years, inserting central lines without the need for ultrasound. Some were then reluctant to use ultrasound when it became available – they hadn’t needed it before so why should they use it now?


What was the level of evidence which led to change in practice? Was it a multicentre RCT or was it a collection of other influential issues:

– The basic scientific notion that it’s better to be able to see what one is doing
– A series of heuristic stories of patients succumbing to complications which might otherwise have been avoided*
– Policies introduced obligating the use of ultrasound when it is available (see here)
– A gradual change in culture, and an adoption by sufficient numbers such that those not using ultrasound perhaps begin to feel estranged
– Retirement of those unwilling to change, those who stoically prevent change no matter what, those whom Dr John Hinds may have labelled as ‘Resus Wankers’.


We do not live or practise in a laboratory, nor within the boundaries of double-blind, placebo-controlled trials. We live in a real world with patients who are also people. Intuition, clinical experience, and pathophysiologic rationale are indeed important tools, along with evidence-based literature, with which to discern the best care for our patients. To honour such a breadth of perspective, however, requires us to loosen our tenacious grip on currently accepted doctrines of EBM as the definitive measure of good clinical practice. For in the end, it is really our common sense, nurtured by education, experience, intuition, and rationale, that is always our ultimate measure of evidence—in medicine as in life itself. (see here)


When better interventions (in accordance with basic science) become available, we should perhaps be more ready to adopt them.


Certainly lack of evidence must not be used inappropriately to stoically defend the status quo – for in the end it is our patients who suffer.



*It is interesting to note that one of the main presenters of NAP4 data routinely uses a video laryngoscope for intubation. Perhaps he has been influenced by an understanding of Science Based Medicine and exposure to numerous stories of patient demise from difficult tracheal intubation.


About the Author

Dr Rob Hackett is a senior anaesthetist from Sydney, Australia.  He is a passionate campaigner for patient safety, and the author of the Patient Safe blog.

Case Study – how a junior doctor is using Osler to set themselves apart

“Osler has helped me better understand what I’ve done, and where I need to focus my attention going forward.”

Dr Abby McArthur sits back and looks at her activity charts on Osler with a look of contemplation.  “I’ve completed my certification in internal jugular central lines, but it would be nice to do a few more subclavians.”

Abby, a first year trainee in Emergency Medicine and Intensive Care, has been using the Osler platform to track the procedures she performs, her success rates, and her complications.  She has followed her progress through three separate hospitals so far.

“I was having some trouble early on with my PICCs.  I was struggling to get them in at first, but I changed a few things and my success rate is much better now.”

This improved visibility of Abby’s outcome data has helped her in other ways too.

“I’ll be applying for new training positions for the new year, and I’m going to use the data I collect on Osler to demonstrate the certifications I’ve completed, and my activity data, to help build my resume.”

In fact, Osler has already helped Abby get noticed.

“I was told the CEO of one hospital I worked in complimented me on how much I’d achieved there during my rotation.  Apparently that doesn’t happen very often!”

“Having access to Osler at the bedside is very important.  It means that I can easily record all my procedures right after I do them.

Also, because I’m learning new procedures at this stage of my career, my supervisors can use Osler to give me instant feedback that I can use to improve my technique for next time.”

Osler’s groundbreaking cloud based assessment framework means Abby can capture and access her data wherever she goes, making it as easy as it can be to monitor her progress.

Skills decline – a problem on the rise

In the two-year course of our Osler journey, my business partner Jeff has told me many times who in the hospital he’d want looking after him if he needed a procedure performed: the senior registrar.


As Jeff sees it, senior registrars are about as sharp skills-wise as they are ever going to get.  They do the most procedures, they learned the most recently and they are yet to be cloaked by an air of invincibility.


And he’s not far off the mark.


But what it highlights is an increasingly recognised phenomenon – skills attrition in consultants.


In many procedural specialties, there is an almost precipitous drop off in exposure to invasive procedures from the day you pass your fellowship exam.  There is a changing of the guard, where those who once did, now supervise.  Add to that the competition for access to increasingly rare opportunities and there is little doubt that emergency physicians, retrievalists, rural generalists and intensivists are starved of exposure.


More likely, the problem has been quietly suspected for some time, but as an industry we’ve been inclined to turn a blind eye to it, because acknowledging it presents an even bigger problem – if we’re all diminishing in our skills capacity, what on earth are we going to do about it?


But the problem is now becoming too big to ignore.  Andrew Tagg, an emergency physician from Melbourne, wrote about this recently.  Access to opportunities to perform procedures is becoming so rare that inevitably we are all deskilling.


So what to do?


The first step in any quality assurance process is to measure.  Any textbook on clinical audit will tell you the three key areas that we can measure – activity, process and outcome.


The first should be easy.  Documenting our activity is an important first step in detecting gaps in our experience.  There is a fairly clear relationship between recency of performance and ability to execute, so it makes sense to track the volume and timing of our activity.


The second examines our method.  Is it really too much to ask to submit ourselves to periodic review of our performance by our peers? Is there a better way to validate that my practice is consistent with modern standards?  While inevitably there are logistical challenges with this style of approach, the potential benefit in safety terms more than justifies applying it.


Finally, and most problematic, is to measure outcomes.  It’s difficult for many reasons, not the least of which are standardising definitions, accurate data collection (particularly of delayed outcomes) and the relatively low incidence of complications for most things we do.


We should not refuse to measure ourselves because we are afraid of what it might tell us.  The more mature response is to find out where our limitations lie, and find a solution.


We owe that much to our patients.


The old adage is that “Not all that is important can be measured, and not all that can be measured is important.” However, there is plenty that can be measured and is of value to us.


We owe it to our patients to try.


Osler Community goes live!

Individual doctors, nurses and paramedics at all stages of their careers are now able to take advantage of Osler’s unique performance development tools, with the release of Osler Community.


Released this week, Osler Community allows users to learn new skills, invite assessors to review their skills and provide feedback, collect certifications and record their procedural activity.  Powerful analytical tools built into the Osler application can help them understand their own performance like never before.


And all this is built on a global collaborative platform that helps you connect with your peers around the world.


For more information on the Community Edition, head over to http://osler.community
