Background: In the current healthcare environment, technical skills education emphasizes quality improvement, which demands ongoing skill assessment. Objectively assessing competency is a complex task that, when done effectively, improves patient care; current methods, however, are time-consuming, expensive, and subjective. Crowdsourcing is the practice of obtaining services from a large group of people, typically the general public, via an online community. CSATS (Crowd-Sourced Assessment of Technical Skills) uses crowdsourcing as an innovative way to rapidly, objectively, and comprehensively assess technical skills. We hypothesized that CSATS could accurately evaluate the technical skill proficiency of nurses.
Methods: An interface displaying one of 34 video recordings of nurses performing a glucometer skills test, together with a corresponding survey listing each required step, was uploaded to an Amazon.com-hosted crowdsourcing site, Mechanical Turk™. The crowd evaluated the completion and sequence of the glucometer steps in each video.
Results: In under 4 hours, we obtained 1,300 crowd ratings, approximately 38 per video, evaluating each user's performance on step completion and correct step order. The crowd identified individual performance variance and specific steps frequently missed by users, and provided feedback tailored to each user. CSATS identified 15% of nurses who would benefit from additional training.
Conclusion: Our study showed that healthcare-naïve crowd workers can assess technical skill proficiency rapidly and accurately at nominal cost. CSATS may be a valuable tool to assist educators in creating targeted training curricula for nurses in need of follow-up while rapidly identifying nurses whose technical skills meet expectations, thus dramatically reducing the resource burden of training.
Published on: Aug 26, 2016 Pages: 40-44