Introduction
To achieve expert performance in care teams, adequate simulation-based team training courses with an effective instructional design are essential. As the importance of instructional design becomes increasingly clear, an objective assessment tool would be valuable for educators and researchers. We therefore aimed to develop an evidence-based, objective assessment tool for evaluating the instructional design of simulation-based team training courses.

Methods
In this validation study, we developed an assessment tool comprising an evidence-based questionnaire scored on Visual Analogue Scales (VAS) and a visual chart that directly translates the questionnaire results. The psychometric properties of the tool were tested using descriptions of five simulation-based team training courses, for which an expert-opinion-based ranking from poor to excellent was obtained. Ten independent raters assessed the five training courses twice, using the developed questionnaire, with a 2-week interval. Validity and reliability analyses were performed by comparing the raters' scores with the expert-opinion-based ranking. Usability was assessed with an 11-item survey.

Results
A 42-item VAS questionnaire and a propeller chart were developed. The Spearman correlation between the expert-opinion-based ranking and the raters' scores was 0.95, and the variance due to rater subjectivity was 3.5% (V Training×Rater). The G-coefficient was 0.96. Inter-rater reliability (intraclass correlation coefficient, ICC) was 0.91 (95% CI 0.77 to 0.99), and intra-rater reliability for the overall score (ICC) ranged from 0.91 to 0.99.

Conclusions
We developed an evidence-based and reliable assessment tool for evaluating the instructional design of simulation-based team training courses: the ID-SIM. The ID-SIM is available as a free mobile application.
- instructional design
- team training
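The validity analysis above compares an expert-opinion-based ranking of courses with rater scores using Spearman's rank correlation. A minimal, dependency-free sketch of that computation is shown below; the rankings and scores are hypothetical illustrative values, not the study's data.

```python
# Illustrative sketch (hypothetical data, not the ID-SIM study's results):
# Spearman's rho between an expert ranking of five courses and mean
# rater scores, computed as the Pearson correlation of the ranks.

def ranks(values):
    """Return 1-based ranks; no ties are expected in this toy example."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical expert ranking (1 = poor .. 5 = excellent) and mean
# VAS-based overall scores for five courses; one reversal, so rho < 1.
expert_ranking = [1, 2, 3, 4, 5]
mean_scores = [22.4, 38.1, 60.0, 55.6, 88.9]

print(round(spearman_rho(expert_ranking, mean_scores), 2))  # 0.9
```

A rho near 1 indicates that the raters' scores order the courses almost exactly as the experts did, which is the sense in which the abstract's reported correlation of 0.95 supports validity.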