Improvement in Scoring Consistency for the Creighton Simulation Evaluation Instrument ©

Mary E. Parsons, Kimberly S. Hawkins, Maribeth Hercinger, Martha Todd, Julie A. Manz, Xiang Fang

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Background: The Creighton Simulation Evaluation Instrument © is a valid and reliable instrument for evaluation of students in the simulated clinical environment. Increased use of simulation and requests for widespread use of the instrument prompted the development of an educational program to maintain reliability.

Method: The educational program included review of the instrument and dialogue among faculty to determine expectations of student performance. Percentage agreement and kappa coefficient scores were calculated pre- and posttraining.

Results: Improved percentage agreement scores were noted. Seventy-five percent of the kappa scores after implementation of the educational program demonstrated moderate, substantial, or almost perfect agreement.

Conclusions: Education and faculty dialogue improve consistency of student assessment and add richness to the learning experience.
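The abstract's reliability measures, percentage agreement and the kappa coefficient, can be illustrated with a short sketch. This is not the authors' analysis or data; the two rater score lists below are hypothetical, and the kappa computed is standard Cohen's kappa, which corrects raw agreement for the agreement expected by chance.

```python
from collections import Counter

def percent_agreement(r1, r2):
    # Fraction of items on which the two raters assign the same score
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    # Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e),
    # where p_o is observed agreement and p_e is expected agreement
    # under independent raters with the same marginal frequencies.
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical dichotomous ratings (0 = criterion not met, 1 = met)
# from two faculty members scoring the same eight simulation items.
rater1 = [1, 1, 0, 1, 0, 1, 1, 0]
rater2 = [1, 1, 0, 0, 0, 1, 1, 1]

print(percent_agreement(rater1, rater2))  # 0.75
print(round(cohens_kappa(rater1, rater2), 3))  # 0.467
```

On these made-up ratings, raw agreement is 0.75 but kappa is about 0.47, which falls in the "moderate" band of the commonly used Landis and Koch interpretation referenced by phrases such as "moderate, substantial, or almost perfect agreement."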

Original language: English (US)
Pages (from-to): e233-e238
Journal: Clinical Simulation in Nursing
Volume: 8
Issue number: 6
DOIs
State: Published - Jul 1 2012

All Science Journal Classification (ASJC) codes

  • Modeling and Simulation
  • Education
  • Nursing (miscellaneous)
