The ABCs of DKA: Development and Validation of a Computer-Based Simulator and Scoring System.

Yu CH, Straus S, Brydges R - J Gen Intern Med (2015)

Bottom Line: Participants' scores showed a significant effect of training level (p < 0.001). Scores also correlated with the number of DKA patients they reported treating, weeks on Medicine rotation, and comfort with managing DKA. Our evidence suggests that it can be used for formative assessment of trainees' DKA management skills.


Affiliation: St. Michael's Hospital, Toronto, ON, USA, yuca@smh.ca.

ABSTRACT

Background: Clinical management of diabetic ketoacidosis (DKA) continues to be suboptimal; simulation-based training may bridge this gap and is particularly applicable to teaching DKA management skills, given that it enables learning of basic knowledge as well as clinical reasoning and patient management skills.

Objectives: 1) To develop, test, and refine a computer-based simulator of DKA management; 2) to collect validity evidence according to national standards' validity framework; and 3) to judge whether the simulator scoring system is an appropriate measure of the DKA management skills of undergraduate and postgraduate medical trainees.

Design: After developing the DKA simulator, we completed usability testing to optimize its functionality. We then conducted a preliminary validation of the scoring system for measuring trainees' DKA management skills.

Participants: We recruited year 1 and year 3 medical students, year 2 postgraduate trainees, and endocrinologists (n = 75); each completed a simulator run, and we collected their simulator-computed scores.

Main measures: We collected validity evidence related to content, internal structure, relations with other variables, and consequences.

Key results: Our simulator consists of six cases highlighting DKA management priorities. Real-time progression of each case includes interactive order entry, laboratory and clinical data, and individualised feedback. Usability assessment identified issues with clarity of system status, user control, efficiency of use, and error prevention. Regarding validity evidence, Cronbach's α was 0.795 for the seven subscales, indicating favorable internal structure evidence. Participants' scores showed a significant effect of training level (p < 0.001). Scores also correlated with the number of DKA patients they reported treating, weeks on Medicine rotation, and comfort with managing DKA. A score of 75 % on the simulation exercise had a sensitivity of 94.7 % and a specificity of 51.8 % for distinguishing expert staff physicians from trainees.
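For readers unfamiliar with the two statistics reported above, the following is a minimal, self-contained Python sketch of how Cronbach's α (internal consistency across subscales) and a cutoff's sensitivity/specificity are computed. The function names and all numeric data here are illustrative assumptions, not the study's data or code.

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha: internal consistency across subscales.

    item_scores: one list per subscale, each holding one score per participant.
    """
    k = len(item_scores)
    # Per-participant total score across all subscales
    totals = [sum(vals) for vals in zip(*item_scores)]
    item_var = sum(statistics.pvariance(vals) for vals in item_scores)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

def sensitivity_specificity(scores, is_expert, cutoff):
    """Treat score >= cutoff as 'classified expert' and compare to true status."""
    tp = sum(1 for s, e in zip(scores, is_expert) if e and s >= cutoff)
    fn = sum(1 for s, e in zip(scores, is_expert) if e and s < cutoff)
    tn = sum(1 for s, e in zip(scores, is_expert) if not e and s < cutoff)
    fp = sum(1 for s, e in zip(scores, is_expert) if not e and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example run (invented data, not from the study):
alpha = cronbach_alpha([[70, 80, 90, 60], [65, 85, 95, 55], [72, 78, 88, 62]])
sens, spec = sensitivity_specificity(
    scores=[82, 91, 78, 60, 74, 88, 55, 70],
    is_expert=[True, True, True, False, False, True, False, False],
    cutoff=75,
)
```

In the study's terms, a 75 % cutoff with high sensitivity but modest specificity catches nearly all experts while misclassifying some trainees as experts, which is why the authors position the tool for formative rather than summative assessment.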

Conclusions: We demonstrate how a simulator and scoring system can be developed, tested, and refined to determine its quality for use as an assessment modality. Our evidence suggests that it can be used for formative assessment of trainees' DKA management skills.



Fig. 1: Mean score, percentage of actions correct, and number of critical errors by level of training. Error bars indicate standard deviation. Groups: undergraduate medical students in year 1 (MS1) with limited knowledge and expertise, undergraduate medical students in year 3 (MS3), postgraduate trainees in year 2 of internal medicine residency (PGY2), and staff endocrinologists.
© Copyright Policy - OpenAccess

Mentions: Simulator refinement based on heuristic evaluation

