
HiQuiPs: Choosing Wisely Canada and Quality Improvement Part 2

Updated: Feb 27, 2022

Authors: Ahmed Taher and Davy Tawadrous


Choosing Wisely Canada Logo (source: https://choosingwiselycanada.org/wp-content/uploads/2017/02/2015-04-10_CWC-brandbook-04_pages.pdf)

Welcome to our second HiQuiPs post highlighting Choosing Wisely, an international movement dedicated to reducing overuse in health care. Examining variation among providers and organizations in the use of tests and procedures can help identify areas of overuse and strategies to reduce it. In our first post, we discussed sources of variation in healthcare and, specifically, how variation can be an indicator of unnecessary tests and/or procedures.

Through the publication of specialty-specific lists of recommendations, Choosing Wisely campaigns have spurred many local, national, and international QI initiatives, including many in emergency medicine. Today, we’ll take a deep dive into how the St. Michael’s Hospital Emergency Department, a trauma, stroke, and cardiac intervention center in Toronto, Canada, used QI methodology to successfully and safely reduce unnecessary coagulation testing (1). In this example, you will recognize some of the key features of QI methodology that we have discussed in our series.


SMART Aims Make for Smart Projects

While frequently performed, coagulation testing in the emergency department has specific indications that are well established (2). However, Fralick et al. (2017) found that many providers did not understand the indications for testing and, as such, frequently ordered coagulation testing inappropriately (1). While it is unclear how this specific gap was identified locally (e.g., through a formal root cause analysis), they set out with the aim:


“To reduce PT/INR and aPTT blood testing in the emergency department (ED) over a 12-month period using an iterative quality improvement strategy.”


Reflecting on our previous discussion of SMART aim statements, how does this aim statement fare? Is it:

  • Specific – while the research team did specify the test of interest and setting, identifying a target reduction would have improved the specificity of their statement (and possibly motivated their team to reach that target).

  • Measurable – the aim is measurable by means of test utilization rates before and after each intervention.

  • Actionable – the aim is within their team’s (ED physicians, ED nurses, laboratory staff, hospital administrators) scope of influence.

  • Realistic – this is tied to the specificity of the aim, in that clearly outlining a very specific aim (i.e., who, what, when, and by how much) is required to judge how realistic that aim is. Another possibility is to consider previous literature for the expected effect size.

  • Time-defined – the aim is well-defined with a specific timeline of 12 months.

Overall, Fralick et al. generated a great SMART aim that could have benefited from increased specificity on the degree of expected reduction (e.g., an absolute reduction of 20%) (1).


A Comprehensive Picture of Change

Once the project team has identified a problem and clearly outlined a SMART aim, a critical next step is to establish a comprehensive family of measures. A family of measures includes:

  1. Primary measures (or outcome measures) identify whether an intervention has improved the metric of interest. In this study, the primary measures were the change in the weekly rate of PT/INR and aPTT testing per 100 ED patients and any associated cost savings (see the calculation sketch after this list).

  2. Process measures help determine whether the intended change in fact led to improvement in the primary outcome. The authors listed the rates of INR, aPTT, and creatinine testing, as well as patient volume data. Further process measures could potentially include the frequency of downstream coagulation testing for patients.

  3. Balancing measures highlight whether improvements in the primary measures cause other unintended changes in the system, in order to avoid improvements that come at the expense of other variables. Balancing measures in this study included the rate of patients receiving blood product transfusions (i.e., patient harm incurred as a result of not performing coagulation testing).
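
To make the primary measure concrete, below is a minimal Python sketch of how a weekly testing rate per 100 ED patients might be computed. The counts and layout are hypothetical and are not drawn from the study’s data.

```python
# Illustrative only: compute a weekly coagulation-testing rate per 100 ED patients.
# The weekly counts below are made up for demonstration purposes.
weekly_counts = [
    # (week number, coagulation tests ordered, ED patient visits)
    (1, 520, 1450),
    (2, 498, 1390),
    (3, 151, 1420),  # e.g., a hypothetical post-intervention week
]

for week, tests, visits in weekly_counts:
    rate_per_100 = 100 * tests / visits
    print(f"Week {week}: {rate_per_100:.1f} tests per 100 ED patients")
```

Plotting these weekly rates over time, with interventions annotated, is what produces the run chart discussed further below.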


Try, Then Try Again

At this stage, the team can begin to perform sequential, iterative Plan-Do-Study-Act (PDSA) cycles in an attempt to improve their primary outcome. Fralick et al. describe three PDSA cycles:


PDSA Cycle 1:
  • Aim: Determine why PT/INR and aPTT testing was being frequently ordered in the ED.
  • Hypothesis: Unfamiliarity with coagulation testing indications was a major contributor.
  • Intervention: Provide ED physicians with educational materials.
  • Outcome: No change.


PDSA Cycle 2:
  • Aim: Understand how coagulation tests were being performed in the laboratory.
  • Hypothesis: Providers frequently ordered testing “unknowingly” because the tests were automatically included in order sets and coupled at the back end via laboratory software.
  • Intervention: Uncouple coagulation testing at the back end via laboratory software.
  • Outcome: Little impact on the rate of coagulation testing.


PDSA Cycle 3:
  • Aim: Understand why coagulation testing was unchanged even after uncoupling.
  • Hypothesis: Order sets had not been revised, and thus both tests were still being automatically ordered at the front end.
  • Intervention: Coagulation testing was removed from five order sets.
  • Outcome: A significant reduction in testing rates (cut in half!), amounting to $4,680 (USD) in direct cost savings per month.


When reviewing the different PDSA cycles, a question might arise about the relative effectiveness of the different interventions, as Fralick et al. used multiple strategies. The hierarchy of effectiveness for QI interventions places educational strategies as generally less effective and forcing functions (e.g., back-end uncoupling, removing tests from order sets) as more effective; however, many teams use multiple approaches because they are often synergistic. Fralick et al. noted similar findings with their interventions (1).


Laboratory testing of PT/aPTT and creatinine in the St. Michael’s Hospital Emergency Department (source: Fralick et al., https://bmjopenquality.bmj.com/content/6/1/u221651.w8161)

As discussed in our previous posts on run charts (parts 1 and 2), the baseline period needs a minimum of 10 stable data points (3,4). The authors show a baseline period of 13 weeks, which included over 8,000 patients. The subsequent PDSA cycles collectively spanned 42 weeks; the authors, however, did not directly note when the educational strategy was started. Visually, a marked change is observable at week zero. However, probability-based rules could have been used on the run chart to further demonstrate statistically significant changes. Check out our previous posts on run charts for more information.
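
To illustrate what such probability-based rules look like in practice, here is a rough Python sketch that applies two commonly used run chart rules (a shift of six or more consecutive points on one side of the median, and a trend of five or more consecutively increasing or decreasing points) to hypothetical weekly testing rates. The numbers are invented for demonstration, and conventions for handling points on the median or tied values vary across sources.

```python
# Illustrative run chart signal rules applied to hypothetical weekly testing rates.
from statistics import median

rates = [35, 36, 34, 37, 35, 36, 34, 35, 33, 36,  # hypothetical baseline weeks
         22, 21, 20, 19, 21, 20, 18, 19]          # hypothetical post-intervention weeks

center = median(rates)

def longest_shift(values, center):
    """Longest run of consecutive points strictly above or strictly below the median
    (points exactly on the median are skipped in this simplified sketch)."""
    best = run = 0
    last_side = 0
    for v in values:
        side = (v > center) - (v < center)  # +1 above, -1 below, 0 on the median
        if side == 0:
            continue
        run = run + 1 if side == last_side else 1
        last_side = side
        best = max(best, run)
    return best

def longest_trend(values):
    """Longest run of consecutively increasing or decreasing points
    (tied consecutive values reset the count in this simplified sketch)."""
    best = run = 1
    last_dir = 0
    for prev, curr in zip(values, values[1:]):
        direction = (curr > prev) - (curr < prev)
        if direction != 0 and direction == last_dir:
            run += 1
        elif direction != 0:
            run = 2
        else:
            run = 1
        last_dir = direction
        best = max(best, run)
    return best

print(f"Median rate: {center}")
print(f"Longest shift: {longest_shift(rates, center)} points (signal if >= 6)")
print(f"Longest trend: {longest_trend(rates)} points (signal if >= 5)")
```

A shift or trend exceeding these thresholds is unlikely to occur by chance alone, which is what allows a run chart to support a claim of non-random (i.e., statistically meaningful) change.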


Summary

Overall, Fralick and colleagues successfully used QI methodology to reduce overuse in the emergency department, substantially reducing coagulation testing and generating significant cost savings without evidence of patient harm (1). Now consider your own setting: what recommendations could you implement to improve the quality of care provided to patients locally? Are there tests or treatments that are over-, under-, or inappropriately utilized? There are many emergency-specific recommendations and toolkits that have been published and recently updated by Choosing Wisely Canada; the references by Cheng et al. in CJEM 2017 (5) and 2019 (6) are a great place for ideas on where to start!


Special thanks to Karen Born and Stephanie Callan from Choosing Wisely Canada for their input.


Senior Editor: Lucas Chartier
Copyedit by: Mark Hewitt



References

  1. Fralick M, Hicks L, Chaudhry H, et al. REDucing Unnecessary Coagulation Testing in the Emergency Department (REDUCED). BMJ Qual Improv Rep. 2017;6(1). doi:10.1136/bmjquality.u221651.w8161

  2. Zehnder J. Clinical Use of Coagulation Tests. UpToDate. https://www.uptodate.com/contents/clinical-use-of-coagulation-tests. Accessed December 10, 2019.

  3. Taher A, Choi J. Reporting Run Charts Part 1. CanadiEM - HiQuiPs. https://canadiem.org/reporting-qi-results-run-charts/. Published November 10, 2019. Accessed December 15, 2019.

  4. Atlin C, Taher A. Reporting Run Charts Part 2. CanadiEM - HiQuiPs. https://canadiem.org/reporting-qi-results-part-2-run-charts/. Published December 11, 2019. Accessed December 15, 2019.

  5. Cheng A, Campbell S, Chartier L, et al. Choosing Wisely Canada®: Five tests, procedures and treatments to question in Emergency Medicine. CJEM. 2017;19(S2):S9-S17. doi:10.1017/cem.2017.1

  6. Cheng A, Campbell S, Chartier L, et al. Choosing Wisely Canada’s emergency medicine recommendations: Time for a revision. CJEM. 2019;21(6):717-720. doi:10.1017/cem.2019.405
