
POSTGRADUATE MEDICAL EDUCATION

Competency Based Medical Education

Work Based Assessments

In Competence by Design (CBD), competency involves more than ‘know how’. It requires that learners also ‘show how’ and demonstrate the ability to ‘do’ independently. While learning is often associated with teaching, there is also high value in using assessment as a learning tool.

Workplace-based assessment (WBA) maintains this formative focus by having frontline clinical teachers observe learners in authentic workplace settings and document those observations on a regular basis. The results of individual observations are shared with learners in a way that guides learning improvement. When these individual workplace-based assessments are aggregated over time, the data from multiple observations and multiple sources give a clearer picture of a learner’s performance and progress.

One goal of WBA is to collect as much information as possible about a learner, from as many sources as possible. Over time this data is pooled together to create a comprehensive image of the learner’s competence and help inform competence committee discussions about resident promotion. By shifting these broader promotion discussions to the competence committee, interactions between individual learners and observers can focus on coaching feedback designed to help improve learners’ performance.

In CBD, both learners and observers initiate practice observations. These may be direct or indirect. Optimal performance feedback results from direct observation; however, this is not always feasible given workflow demands or the desire for increasing learner independence. Regardless of the WBA tool used, or whether observations are direct or indirect, clinical teachers assessing learner performance should remember:

• Narrative comments focused on specific behaviors are the most valuable information in any WBA tool. Good comments provide trainees with detailed guidance for improvement, and competence committees with rich context for the performance ratings.

• Isolated practice activities are linked to, but are not sole determinants of, EPA achievement (e.g. performing a knee exam is only one part of the osteoarthritis management EPA).

• WBA tools provide performance rating information and feedback specific to only that activity and context. As an observer, you are not deciding a trainee’s overall competence moving forward.

• Learner progression decisions are informed by multiple observations using an entrustability scale.

What is an entrustability scale?

These scales use entrustment anchors to rate a learner’s ability to safely and independently perform practice activities indicative of EPA achievement. The following entrustability scale is incorporated within several WBA tools endorsed by the Royal College.

The new expanded O-Scale anchors are now updated in MedSIS:

Level 1: “I had to do” (the physician had to perform the clinical activity while the resident observed)
Level 2: “I had to talk them through” (the resident required constant direction)
Level 3: “I had to prompt them from time to time” (the resident required frequent direction)
Level 4: “I needed to be there just in case”
REVISED: “I had to provide minor direction” (the resident required minor direction)
Level 5: “I did not need to be there”
REVISED: “I did not need to provide direction for safe and independent care” (no direction was required for safe independent care)