Validity and reliability of an objective structured assessment tool for performance of ultrasound-guided regional anaesthesia

Br J Anaesth. 2018 Oct;121(4):867-875. doi: 10.1016/j.bja.2018.06.014. Epub 2018 Jul 31.

Abstract

Background: We examined the validity and reliability of the previously developed criterion-referenced assessment checklist (AC) and global rating scale (GRS) to assess performance in ultrasound-guided regional anaesthesia (UGRA).

Methods: Twenty-one anaesthetists each performed a single real-time UGRA procedure (21 blocks in total), assessed using a 22-item AC and a 9-item GRS scored on 3-point and 5-point Likert scales, respectively. We used one-way analysis of variance to compare assessment scores between three groups (Group 1: ≤30 blocks in the preceding year; Group 2: 31-100; Group 3: >100). Concurrent validity was evaluated using Pearson's correlation (r). We calculated the Type A intra-class correlation coefficient using an absolute-agreement definition in a two-way random-effects model, and inter-rater reliability as absolute agreement between raters. Inter-item consistency was assessed with Cronbach's α.
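The inter-item consistency statistic used here, Cronbach's α, can be computed directly from its standard definition: α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal illustrative sketch follows; the function name and the toy data are assumptions for illustration only and are not drawn from the study's dataset.

```python
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha for inter-item consistency.

    scores: list of subjects, each a list of k item scores
            (e.g. one row per assessed anaesthetist, one column per checklist item).
    """
    k = len(scores[0])
    # Sample variance of each item, taken across subjects
    item_vars = [statistics.variance([subj[i] for subj in scores])
                 for i in range(k)]
    # Sample variance of each subject's total score
    total_var = statistics.variance([sum(subj) for subj in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: three subjects rated on three items that agree perfectly,
# so alpha should equal 1.0
ratings = [[1, 1, 1], [2, 2, 2], [3, 3, 3]]
print(round(cronbach_alpha(ratings), 3))  # → 1.0
```

Values near 1 (such as the α=0.94 and α=0.83 reported below) indicate that the checklist items vary together across subjects, i.e. they measure a common underlying construct.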

Results: Greater UGRA experience in the preceding year was associated with better AC [F(2, 18)=12.01; P<0.001] and GRS [F(2, 18)=7.44; P=0.004] scores. There was a strong correlation between mean AC and GRS scores [r=0.73; P<0.001], and strong inter-item consistency for both the AC (α=0.94) and the GRS (α=0.83). The intra-class correlation coefficient (95% confidence interval) and inter-rater reliability (95% confidence interval) were 0.96 (0.95-0.96) and 0.91 (0.88-0.95) for the AC, and 0.93 (0.90-0.94) and 0.80 (0.74-0.86) for the GRS, respectively.

Conclusions: Both assessments differentiated between individuals who had performed fewer (≤30) and many (>100) blocks in the preceding year, supporting construct validity. Concurrent validity and overall reliability were also established. We recommend both tools for use in UGRA assessment.

Keywords: anaesthetists; checklist; educational assessment; quality; reproducibility of results; ultrasound.

Publication types

  • Observational Study

MeSH terms

  • Anesthesia, Conduction / methods*
  • Anesthesia, Conduction / standards*
  • Checklist
  • Clinical Competence
  • Educational Measurement
  • Humans
  • Observer Variation
  • Reproducibility of Results
  • Ultrasonography, Interventional / methods*
  • Ultrasonography, Interventional / standards*