How do the diagnostic reasoning styles of UK medical and physician associate students compare? An exploratory study using an online patient simulation tool

Talk Code: 
7A.5
Presenter: 
Ruth Plackett
Co-authors: 
Alistair Thorpe, Angelos Kassianos, Maria Kambouri, Jessica Sheringham
Author institutions: 
Department of Primary Care and Population Health, University College London; Department of Applied Health Research, University College London; Institute of Education, University College London

Problem

Primary care teams encompass a wide range of disciplines, many of which have a role in diagnosis. For example, in 2020, there were approximately 650 Physician Associates (PAs) in primary care networks, with numbers expected to rise substantially. Evidence indicates that successful integration of PAs in primary care relies on other professionals having confidence in their clinical reasoning skills (the thinking and decision-making processes that underpin clinical practice), yet there is little evidence of how PAs' clinical reasoning skills compare with those of other professions. This exploratory study aimed to compare the clinical reasoning styles of final-year medical and physician associate students in a simulation setting.

Approach

We used data from eCREST (electronic Clinical Reasoning Educational Simulation Tool), an online patient simulation tool designed to develop clinical reasoning skills. Between 2017 and 2021, four UK medical schools used eCREST with PA students and three with medical students. Students saw a simulated case of a 58-year-old female presenting with chest pain. They could ask up to 32 questions during the case, of which 20 were classified by a panel of GP registrars and GPs as essential for deriving informed diagnoses. We compared reasoning styles between PA and medical students on: 1) the percentage considering lung cancer as a possible diagnosis (odds ratios [OR] with 95% confidence intervals [CIs] using Fisher's exact test); 2) the percentage of essential questions asked (mean difference estimates [MDE] with 95% CIs); and 3) which essential questions were asked.
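For illustration, a minimal sketch of these two comparisons in Python is shown below. The counts and per-student percentages are hypothetical placeholders rather than the study data, and the Wald-type confidence intervals are an assumption about the interval method, which the abstract does not specify.

import numpy as np
from scipy import stats

# 1) Considered lung cancer as a possible diagnosis: 2x2 table of
#    [considered, did not consider]; rows = medical students, PA students.
#    Counts below are hypothetical, for illustration only.
table = np.array([[120, 40],
                  [44, 10]])
odds_ratio, p_value = stats.fisher_exact(table)

# Wald-type 95% CI for the OR on the log scale (one common choice of interval)
log_or = np.log(odds_ratio)
se_log_or = np.sqrt((1.0 / table).sum())
or_ci = np.exp(log_or + np.array([-1.96, 1.96]) * se_log_or)

# 2) Percentage of essential questions asked: mean difference with 95% CI
#    (per-student percentages are simulated here, not taken from the study)
rng = np.random.default_rng(0)
medical = rng.normal(70, 23, 159).clip(0, 100)
pa = rng.normal(73, 19, 54).clip(0, 100)
mde = medical.mean() - pa.mean()
se_mde = np.sqrt(medical.var(ddof=1) / medical.size + pa.var(ddof=1) / pa.size)
mde_ci = mde + np.array([-1.96, 1.96]) * se_mde

print(f"OR={odds_ratio:.2f}, 95% CI [{or_ci[0]:.2f}, {or_ci[1]:.2f}], p={p_value:.3f}")
print(f"MDE={mde:.2f}%, 95% CI [{mde_ci[0]:.2f}, {mde_ci[1]:.2f}]")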

Findings

A total of 159 medical students and 54 PA students completed the case. We did not find evidence of differences between medical and PA students in the percentage who considered lung cancer as a possible diagnosis (75% vs. 81% respectively; OR=0.68, 95% CI [0.28 to 1.53]) or in the percentage of essential questions asked (M=70.13% ± 23.17, around 14 questions, vs. M=73.24% ± 18.89, around 15 questions; MDE=-3.11%, 95% CI [-9.38 to 3.15]). PA students (63%) appeared more likely than medical students (49%) to ask how symptoms were affecting the patient (OR=1.76, 95% CI [0.90 to 3.53]).
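As a rough consistency check (the exact counts are not reported in the abstract, so those used here are inferred), the counts implied by the reported percentages and group sizes reproduce an odds ratio close to 0.68:

# Counts inferred from the reported percentages and group sizes;
# hypothetical, not the study's raw data.
medical_yes, medical_n = 119, 159   # ~75% of 159 medical students
pa_yes, pa_n = 44, 54               # ~81% of 54 PA students
odds_ratio = (medical_yes / (medical_n - medical_yes)) / (pa_yes / (pa_n - pa_yes))
print(round(odds_ratio, 2))  # ~0.68, in line with the reported OR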

Consequences

Developing students' clinical reasoning skills is critical for efforts to improve patient care and reduce diagnostic error in primary care. These results provide suggestive evidence that medical and PA students had similar clinical reasoning styles, in terms of information seeking and diagnostic ideas, when using eCREST. Comparing the reasoning styles of qualified professionals in clinical settings could help us understand how reasoning style affects patients.

Submitted by: 
Ruth Plackett
Funding acknowledgement: 
This research is part of the programme of the Policy Research Unit in Cancer Awareness, Screening and Early Diagnosis. The Policy Research Unit in Cancer Awareness, Screening and Early Diagnosis receives funding for a research programme from the Department of Health Policy Research Programme. The views expressed are those of the authors and do not necessarily reflect those of the NHS, the NIHR or the Department of Health and Social Care.