
Thematic report - Clinical and Professional Skills Assessment formative meetings
Introduction
Between March and July 2019 we met with medical schools and our team who deliver the Professional and Linguistic Assessment Board (PLAB)* test - they all provide high-stakes clinical skills assessments across the UK. We did this to better understand these providers' readiness to deliver the Medical Licensing Assessment (MLA) Clinical and Professional Skills Assessment (CPSA), and to further refine the requirements we will use for quality assurance purposes. This report summarises the key themes from these meetings.
The Medical Licensing Assessment
1 The MLA is a demonstration that those who obtain registration with a licence to practise medicine in the UK meet a common threshold for safe practice.
2 The MLA consists of two parts:
a The Applied Knowledge Test (AKT): a test of applied medical knowledge, to be taken by all candidates
b The Clinical and Professional Skills Assessment (CPSA): the final, high stakes clinical examination run by medical schools for their students, and the CPSA which we will run for international medical graduates (IMGs). Each CPSA must comply with a set of requirements determined by us.
* The Professional and Linguistic Assessment Board test, or the PLAB test, helps us to make sure doctors who qualified abroad have the right knowledge and skills to practise medicine in the UK.

This report concentrates on the CPSA. Below we describe the CPSA in more detail.
Background to the CPSA
The CPSA for the MLA is defined as the final high stakes clinical assessment provided by:
• UK medical schools for medical students in their penultimate or final year of undergraduate education, and
• the GMC, as we will provide a CPSA for those IMGs who wish to practise in the UK and must demonstrate their knowledge and skills through taking the MLA.
The primary objective of the CPSA is to demonstrate that an individual is capable of functioning safely as they enter clinical practice in the UK.
From 2023 each CPSA must meet a number of CPSA requirements set by us (Annex A). The purpose of the CPSA requirements is to make sure that an assessment provider meets the primary objective by demonstrating the quality, consistency and fairness of their CPSA to a standard that we require for the CPSA to count towards a candidate's MLA. The requirements are based on scholarship and good practice.
We will establish a quality assurance process to confirm whether an assessment provider's* CPSA complies with the CPSA requirements.
We published a skeleton version of the CPSA requirements in June 2018. To refine the requirements and explore assessment providers' readiness to meet the requirements, we met with UK medical schools and our team who deliver the PLAB Part 2 exam. These meetings led to some important revisions to the requirements. In this report we explain what we did, what we heard from the assessment providers about some of the requirements, and the changes we made.
The meetings with assessment providers
Between March and July 2019, we ran a series of voluntary formative meetings with assessment providers. We met 37 providers in total. This included 36 medical schools and our team who deliver the PLAB Part 2 exam.
The aims of the meetings were to understand:
• each assessment provider's readiness to meet the CPSA requirements
* This document refers to all medical schools and the GMC's PLAB Part 2 exam as 'assessment providers' and all test-takers as 'candidates'. † The CPSA will replace PLAB Part 2.
• the types of evidence they may wish to submit for each requirement
• any requirements that could prove challenging to provide evidence for and what, if any, support they may require to meet the requirements.
Before the meetings, we sent assessment providers an example return against the draft CPSA requirements, which illustrated what a narrative could look like during the formative engagement phase (Annex B).
In each meeting, the representatives from each assessment provider gave a high-level overview of the design of their CPSA and the policies, resources and processes they have in place to deliver the CPSA. This was followed by a detailed discussion about each requirement, concentrating on the narrative to support each requirement rather than reviewing individual pieces of evidence.
After each meeting, we provided a high-level feedback report to the assessment provider. This specified up to five areas across the requirements where things appeared to be working well, and areas where the discussion suggested that it would be useful for the assessment provider to reflect on their rationale or process in relation to particular requirements. Given that the meetings were formative and voluntary in nature, the high-level feedback report was not intended to be a full review of the discussion.
Summary
The meetings revealed there are a number of approaches to the assessment of clinical and professional skills. The majority of assessment providers use an Objective Structured Clinical Examination (OSCE), with a smaller number using a different format such as an Objective Structured Long Examination Record (OSLER), or a bespoke assessment combining short and long stations. Among medical schools, these assessments are delivered either in the penultimate year or the final year; for some as a standalone assessment, and for others as a linked or multi-component assessment.
While the formats of CPSAs tended to be very different, many features of the assessment framework were similar across providers. The similarities included the methods used to ensure candidates were familiar with the details of the assessment and the suite of policies to support the delivery of the CPSA, such as reasonable adjustments.
In general, assessment providers said they felt well-prepared to meet many of the requirements. In some cases, the process of preparing for the formative meeting had helped providers to identify gaps in their evidence-base or in the rationale for their approach to some aspects of the assessment. Some providers self-reported that they had identified areas for further development.
The meetings were informative and constructive, and discussing the requirements in detail supported our work to develop a further version of the requirements. An amended version of the requirements is annexed to this report.

Discussion of the requirements
Below we report on the themes emerging from discussions about some, but not all, of the requirements. We begin by outlining a general summary of what we heard from the assessment providers. We then report on selected requirements where the meetings revealed consistency in approach among assessment providers or where there is particular variability. Within this report, we do not comment on what approach will or will not comply with the CPSA requirements. The discussion is presented in the order of the requirements, rather than by importance or priority.
The formative meetings did not include scrutiny of evidence or any judgement on whether a particular provider's CPSA complied with the requirements. This was not the purpose or intention of the discussions. At the meetings we purposefully relied on self-reports by the assessment providers and the discussion of the requirements below summarises the self-reported information we heard across the 37 assessment providers.
Requirement 3: Familiarisation with the assessment process
Requirement
Demonstrate how the candidates have been given information about the CPSA well in advance, including:
a assessment format
b scoring and standards
c how the CPSA will be run on the day.
What assessment providers told us
Assessment providers use a range of strategies to support candidates in their preparedness for the CPSA. Opportunities are often distributed throughout the year, and include:
• assessment handbooks, and written assessment guidance that is specific to the penultimate/final year assessments, including the CPSA
• a full or partial mock-version of the CPSA
• the provision of support, by way of resources and space, for peer-led mock CPSAs
• lectures on assessment, either face-to-face and/or video-recorded and available on the virtual learning environment, or an audio voice-over for a PowerPoint presentation
• sharing assessment materials, station instructions and/or mark schemes, for reading or to support peer-led mock CPSAs or practice sessions
• frequently asked questions published on the virtual learning environment
• formal and informal revision and practice sessions with clinical tutors.
In addition, some assessment providers have innovative approaches to prepare their candidates for the CPSA, for example, some medical schools allow final or penultimate year students to take on an examiner or patient role in formative CPSAs for medical students in earlier years.
Requirement 4: CPSA construction and delivery
Requirement
Demonstrate how a range of appropriate stakeholders is involved in the construction and/or delivery of the CPSA.
What assessment providers told us
During the construction of CPSA stations, some assessment providers ask real and/or simulated patients to comment on the scenarios and scripts to support the authenticity of the assessment. Some assessment providers also invite patients to an event where new stations are fully tested, and revised in the light of feedback received. There is variation in the pre-testing of CPSA stations. A number of assessment providers write and deliver CPSA stations without pre-testing, while others use a range of approaches such as testing new stations with F1 doctors, using new stations in formative CPSAs before the summative assessment, or conducting a full run through of the circuit with staff, examiners and patients.
Requirement 8: CPSA design
Requirement
Demonstrate the rationale for the assessment approach used for the CPSA. This should include:
a format (OSCE, OSLER, MOSLER, PACES etc.)
b station design
c testing time, including number and duration of stations
d approach to scoring candidate performance.
What assessment providers told us
Approach to scoring candidate performance
All assessment providers give guidance to examiners by way of global rating scale descriptors. These are used to support consistency of examiners' judgements (which may form the basis of the cut-score calculations). Some assessment providers also provide station-specific marking guidance to help examiners differentiate between different levels of attainment. Across medical schools, this type of guidance often includes information on what the students had been taught on the topics or the skills or knowledge domains that were central to the station task.
Furthermore, several assessment providers demonstrated how they make robust evidence-based changes to the way they approach their CPSA, for example, detailed data modelling to support the implementation of domain-based scoring.
Requirement 9: CPSA design
Requirement
Demonstrate how station writers are trained (eg through station writing workshops run by members of the faculty who are experienced station writers).
What assessment providers told us
There are varying levels of support for CPSA station writers and, in particular, new station writers. Some assessment providers offer training and mentoring, which may include pairing a novice writer with an experienced writer. A range of support materials, such as standardised station templates and guidance, is also available from some assessment providers. In other cases, station writing is conducted partially, or fully, by staff members with responsibility for the year, the CPSA, or particular areas of the curriculum.
Requirement 11: Standard setting
Requirement
Demonstrate how standards are set and why, including:
a standard setting method at both station and overall assessment level
b any additional passing criteria (eg minimum number of stations passed).

What assessment providers told us
We heard of three different approaches to standard setting. From most to least frequent, they were:
• Use of the Borderline Regression Method (BRM). This is used by most assessment providers. It is applied at station-level and typically averaged across stations to give an overall CPSA cut-score. There is variation in the placement of the intersect on the global rating scale, with some using halfway between Borderline and Pass, others using halfway between Borderline Fail and Borderline Pass, and others at Borderline.
• Use of the Angoff, or a modified Angoff, method. This is used by other assessment providers where either the cohort size rules out the use of BRM, the station bank is new and developing, or there is more expertise in Angoff than BRM. A slightly larger number of assessment providers use Angoff in the case of stations for a resit (where Angoff is not used for a resit, typically the BRM standard from the station's last use is carried over).
• Use of a bespoke method. A very small number of assessment providers have developed a bespoke approach to setting the standard within stations, or overall, that they feel best meets the purpose and design of their assessment. These include a modification to BRM, using a mix of Angoff and BRM for different stations within the same circuit, and a method based on a single examiner decision per station and a requirement for a minimum number of station passes. In this latter case, examiners are not asked to provide within-station scores, but to make a single judgement as to whether the candidate has passed or failed the station, with the number of station passes then summed.
Most assessment providers apply a conjunctive standard in addition to the cut-score, based on a minimum number of station passes. Many assessment providers conduct data modelling to inform their choice of approach to standard setting, though modelling is typically in relation to the method and rarely the conjunctive standard.
Some assessment providers add one standard error of measurement (SEM) to their Angoff or BRM cut-score. Those using sequential testing add two SEM for Sequence 1, as a means of identifying those who are and are not required to take Sequence 2. A small number of assessment providers subtract one SEM, but this is at station-level, rather than overall.
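To make the arithmetic concrete, below is a minimal illustrative sketch (Python) of a borderline regression cut-score with a one-SEM adjustment of the kind described above. It is not any provider's actual implementation: the 1-5 global rating scale, the choice of 3 as the borderline point, the assumed reliability of 0.80 and the candidate data are all hypothetical, and the borderline_point argument is where the variation in intersect placement described above would be expressed.

import numpy as np

def station_cut_score(percent_scores, global_ratings, borderline_point=3.0):
    # Fit a straight line of station score (%) on examiner global rating (1-5)
    # and read off the predicted score at the chosen borderline point.
    slope, intercept = np.polyfit(global_ratings, percent_scores, 1)
    return slope * borderline_point + intercept

def overall_cut_score(stations, reliability=0.80, n_sem=1):
    # Average the station-level cut-scores to give an overall cut-score, then
    # add n_sem standard errors of measurement, where
    # SEM = SD(candidate mean scores) * sqrt(1 - reliability).
    cuts = [station_cut_score(scores, ratings) for scores, ratings in stations]
    candidate_means = np.mean([scores for scores, _ in stations], axis=0)
    sem = candidate_means.std(ddof=1) * np.sqrt(1 - reliability)
    return np.mean(cuts) + n_sem * sem

# Hypothetical worked example: two stations, five candidates each,
# scores as percentages paired with 1-5 global ratings.
stations = [
    (np.array([70, 80, 55, 90, 45.0]), np.array([3, 4, 2, 5, 2.0])),
    (np.array([60, 75, 50, 85, 40.0]), np.array([3, 4, 3, 5, 1.0])),
]
print(round(overall_cut_score(stations), 1))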
Assessment providers use a mix of domain, checklist, and, much less frequently, single judgement marking. As a result of the different approaches, and the close link between scoring and standard setting, we have added a new requirement that focuses specifically on approaches to scoring.

Requirement 12: Examiners
Requirement
Demonstrate how examiners are recruited and trained. This should include:
a equality and diversity (E&D) training
b training events before the day of the CPSA covering examiner conduct, awareness of bias and scoring guidance
c calibration exercises to ensure that examiners have a common approach to identifying different levels of performance, especially borderline candidates.
What assessment providers told us
Equality, diversity and inclusion (ED&I) training for examiners and simulated patients
While some assessment providers keep records of examiners' completed ED&I training and check that it is up to date before examiners assess the CPSA, others rely on examiners completing this training as part of an employer's requirements. There is variation between those assessment providers who require a declaration from examiners to confirm that the training has been undertaken and those who don't ask to see any evidence that it has been completed.
Where simulated patients are involved in the running of the CPSA, some assessment providers check that ED&I training has been completed through the actors' agency and, if not, mandate that simulated patients complete an ED&I training package. The training has often been developed by the provider's central university. Others do not collect this information, nor offer this training to simulated patients.
Examiner training events
Assessment providers told us about the examiner training activities that take place throughout the year to prepare examiners and to calibrate their marking ahead of the CPSA. Typically, this is a face-to-face meeting, which might be delivered several times a year, and includes formal presentations, watching videos of mock stations, and undertaking practice marking with discussions of examiners' perceptions and benchmarking against a 'gold standard' (of a chief examiner, panel of experienced examiners, or the senior academic leading the training). While some providers only encourage this training to be completed before live marking, most providers require completion before the assessment and keep a record of this.
The calibration of examiners and simulated patients
Many assessment providers run their CPSA across multiple sites and sessions, often with parallel circuits in each. Across assessment providers there is variability in the resource and time allocated to ensure that examiners and simulated patients assessing the same station develop a consistent approach to scoring candidate performance. For example, some providers ensure that the station is discussed as a group and conduct a full run through of the station before the assessment begins, while other assessment providers reported that the timings of the CPSA circuit do not allow for these types of exercises.
Requirement 16: Resources and space
Requirement
Show that the CPSA takes place in a suitable, secure space with access to appropriate resources.
What assessment providers told us
Assessment providers utilise university and/or trust facilities to deliver their CPSAs, depending on the size of the candidature and the number of parallel circuits required. A small number of assessment providers have purpose-built clinical assessment facilities that can accommodate the required number of circuits. Others use clinical and/or educational spaces which are modified for the purposes of delivering the CPSA, for example, with the use of dividing screens or curtains. Among medical schools, students were largely familiar with the venues used for the CPSA or were given opportunity to visit the venue ahead of the assessment.
Where CPSAs were delivered over multiple sites, assessment providers reported seeking to ensure sufficient resource was allocated to staff co-ordination, oversight, and ensuring consistency of set-up, delivery, briefings and standards.
Requirement 19: Accurate data acquisition
Requirement
Demonstrate the approach to accurate and consistent data acquisition, and dealing with missing data.
What assessment providers told us
Many assessment providers use an electronic marking system for their CPSA, which allows them to mitigate the risk of missing marks.

Where assessment providers use a paper marking system, we heard of checks being conducted on the day of the assessment to reduce the risk of error in the collection of marks. This included asking examiners to stay on site while mark sheets are checked. Some of these providers reported that missing marks still occur to varying degrees of severity.
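As a purely illustrative sketch of this kind of completeness check (the column names and data below are hypothetical, not drawn from any provider's marking system), recorded marks can be compared against the full candidate-by-station grid before examiners leave the site:

import pandas as pd

# Hypothetical mark records: one row per candidate-station score.
marks = pd.DataFrame({
    "candidate_id": ["C1", "C1", "C2", "C2", "C3", "C3"],
    "station":      ["S1", "S2", "S1", "S2", "S1", "S2"],
    "score":        [14.0, 12.0, None, 15.0, 16.0, 11.0],
})

# Every candidate is expected to have a score for every station.
expected = pd.MultiIndex.from_product(
    [marks["candidate_id"].unique(), marks["station"].unique()],
    names=["candidate_id", "station"],
)
recorded = marks.dropna(subset=["score"]).set_index(["candidate_id", "station"]).index
missing = expected.difference(recorded)
print(missing.to_frame(index=False))  # in this toy data: candidate C2, station S1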
Requirement 21: Exam board
Requirement
Demonstrate how assessment performance is analysed post-exam by a group with appropriate expertise, including looking at factors such as the performance of examiners, stations, simulated patients and examination sites, and how the data feeds back into a quality cycle.
What assessment providers told us
All assessment providers reported that they generate and utilise psychometric data for the purposes of assuring their Exam Board about assessment quality and reliability of the outcomes. The data are also used to evaluate the performance of individual stations and to revise stations as required. The depth of analyses of the data varies across assessment providers; some explore performance across sites, circuits and sessions as well as cohort demographics and examiner performance, while others produce more limited data following the CPSA.
Psychometric data are generated either by a software package, a data analysis unit within the faculty or university division, one or more psychometricians employed by the assessment provider, or an external consultant psychometrician. Where these analyses are carried out manually and in-house, there is variation in the spread of expertise, and levels of support for staff, to fulfil this function.
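For illustration only, one statistic commonly reported in such analyses is an internal-consistency reliability coefficient such as Cronbach's alpha; the sketch below (with hypothetical data) shows the calculation from a candidates-by-stations score matrix.

import numpy as np

def cronbach_alpha(score_matrix):
    # Cronbach's alpha for a candidates x stations matrix of marks:
    # alpha = k/(k-1) * (1 - sum(station variances) / variance(total scores)).
    X = np.asarray(score_matrix, dtype=float)
    k = X.shape[1]
    station_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - station_vars / total_var)

# Hypothetical data: six candidates by four stations.
scores = [
    [14, 12, 15, 13],
    [10,  9, 11, 10],
    [17, 16, 18, 15],
    [12, 11, 13, 12],
    [15, 14, 16, 14],
    [ 9,  8, 10,  9],
]
print(round(cronbach_alpha(scores), 2))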
Requirement 22: Exam board
Requirement
Demonstrate how unprofessional candidate, examiner and simulated patient behaviours during the CPSA are captured and dealt with (eg cause for concern/yellow card).
What assessment providers told us
Some assessment providers use strategies to identify aspects of candidates' unprofessional behaviour during the CPSA, for example through a system of yellow and red cards or flags, and documents for examiners and/or patients to record their concerns. Others have
less defined processes for identifying and escalating concerns of this nature and policies for reviewing and dealing with them. For example, approaches to how professionalism concerns may affect a candidate's CPSA outcome and progression vary across assessment providers.
Requirement 24: Results and feedback to candidates
Requirement
Demonstrate what individual results (pass/fail, grades) and feedback (eg narratives) are given to candidates and why.
What assessment providers told us
Assessment providers differ in the extent of performance-based feedback they provide to candidates. Typically, assessment providers share quantitative indicators of individual and cohort performance at the station-level and overall. Where assessment providers are able to process examiner feedback comments, these are also usually made available to candidates. Where assessment providers are not able to provide examiner comments to all candidates, they are typically made available for candidates who failed the CPSA.
Free-text feedback from examiners
To support them in their learning, many assessment providers give candidates examiners' free text comments. While training is often given to examiners on how to provide effective feedback, there is variation in the thoroughness of the quality checks on these feedback comments to ensure they are appropriate before they are sent to candidates.
Support for failing candidates*
Assessment providers reported a range of processes to support candidates who've received a fail outcome for the CPSA. Among medical schools these include one-to-one meetings for students with teaching staff to discuss the results and feedback, developing support plans and providing revision or one-to-one coaching sessions in advance of the resit.
* As a result of the discussions with assessment providers, in the pilot version of the requirements, support for failing candidates has become a separate requirement.

Next steps
Revising the requirements for piloting
The version of the requirements published in June 2018 and forming the basis for the formative meetings with assessment providers was structured in three sections: 'before, during and after' the assessment. This structure allowed us to comprehensively capture all the elements of a CPSA. Feedback from medical schools, both individually and via meetings and workshops organised by the Medical Schools Council (MSC), was positive about this structure. However, the structure leads to some repetition and the formative meetings with the 37 assessment providers rarely discussed the requirements in this way. The conversations were, instead, more thematic in nature rather than following the chronological process of a CPSA.
Based on the discussions with assessment providers, the requirements have been reformatted to reflect the main features of CPSA development and delivery.
• Design
• Content
• Preparation of and support for candidates
• Preparedness of examiners and patients for the CPSA
• Policies and resources
• Data management
• Evaluation and quality assurance.
Inevitably, some overlap will remain due to the interconnected nature of a CPSA. We anticipate that this new structure should make it easier for assessment providers to compile their evidence and will allow us to scrutinise evidence more efficiently as part of the quality assurance process established to check whether a provider complies with the requirements.
Content
The major changes are:
• Requirement 2 – Scoring: How examiners score candidates was previously a subset of the CPSA design requirement. During the meetings with assessment providers, it became clear that there is a variety of approaches to scoring within stations, which we need to understand in greater detail to assure ourselves how candidates' marks are allocated.
• Requirement 3 – Assessing professionalism: We've isolated the professionalism requirement, as assessment providers have varying approaches to capturing unprofessional behaviours in the CPSA.
• Requirement 7 – CPSA construction: We've refocussed this requirement on how assessment providers can demonstrate they've got the appropriate range of stakeholder engagement in constructing their stations: for example, including the patient perspective when writing the brief for the simulated patient, or piloting new stations on Foundation Year 1 doctors.
• Requirement 8 – Quality of CPSA content: We've expanded the requirement on how station writers are trained, to encompass how assessment providers assure us that station quality is considered at all stages, from piloting to incorporating feedback post-exam.
• Requirement 12 – Preparing candidates for a resit: This is a new requirement. We'd previously asked about how candidates were prepared to take the CPSA, but we hadn't covered how a failing candidate was supported for the resit.
• Requirement 22 – Internal quality assurance: This is a new requirement. We'd previously asked about policies and procedures, resources and space, and the external examiner, but we hadn't explicitly covered issues like how assessment providers assure themselves that candidates will have a similar experience at a multi-site CPSA.
We've also merged several requirements where they were not distinct (the external examiner pro-forma was subsumed into the general external examiner requirement, and the briefing of candidates on the day became part of the general candidate preparation requirement) and made minor amendments to the wording of others.
The amended requirements will be used during the next stage of developing the MLA: a pilot of the CPSA review process beginning in early 2020. This version of the requirements will be subject to review post-pilot.

Annex A
September 2019 – CPSA requirements for piloting
Requirements for the Medical Licensing Assessment Clinical and Professional Skills Assessment
Background
The Medical Licensing Assessment
1 The Medical Licensing Assessment (MLA) is a demonstration that those who obtain registration with a licence to practise medicine in the UK meet a common threshold for safe practice.
2 The MLA consists of two parts:
a The Applied Knowledge Test (AKT): a test of applied medical knowledge, to be taken by all candidates.
b The Clinical and Professional Skills Assessment (CPSA): the final, high stakes clinical examination run by medical schools for their students, and the CPSA which we will run for international medical graduates (IMGs).
Purpose of the CPSA
3 The primary objective of the CPSA is to demonstrate that an individual is capable of functioning safely as they enter clinical practice in the UK.
Purpose of this document
4 This document specifies the requirements that each assessment provider's* CPSA must meet.
5 The purpose of these requirements is to make sure that an assessment provider is meeting the primary objective by demonstrating the quality, consistency and fairness of their CPSA to a standard that we require for the CPSA to count towards a candidate's MLA. In identifying these requirements, we have ensured they are based on scholarship and good practice.
* This document refers to all medical schools and the GMC as 'assessment providers' and all test-takers as 'candidates'.

6 The CPSA is the final, high stakes clinical assessment, irrespective of the format that each provider has chosen to use (eg Objective Structured Clinical Examination (OSCE), Objective Structured Long Examination Record (OSLER), Practical Assessment of Clinical Examination Skills (PACES)).
7 Assessment providers will be asked to submit evidence to us to show how their CPSA meets these requirements. This evidence will be reviewed to decide whether assessment providers meet the primary objective.
Updates from the June 2018 version
8 This draft is an update to reflect the feedback and learning from our engagement with stakeholders since the original draft requirements were published in June 2018, most particularly the formative exercise run with medical schools between March and July 2019. This version of the requirements will be subject to review post-pilot.
Design
Assessment strategy
1 Demonstrate how the CPSA sits within the overall suite of assessments for the final and penultimate years, eg workplace based assessments (WPBA) and clinical procedural skills.
Suggested evidence
i Assessment strategy or programme assessment map.
ii Evidence that individual candidate performance has been reviewed and progression decisions are made consistent with procedures (eg minutes from exam boards/progress panels showing that only candidates eligible to progress enter the CPSA).
CPSA design
2 Demonstrate the rationale for the design of the CPSA. This should include:
a format
b station type
c testing time, including number and duration of stations.

Suggested evidence
i Description of the CPSA and explanation of the rationale underpinning the design of the CPSA, including format (OSCE, OSLER, MOSLER, PACES etc.), station type (long case, integrated skills, etc.) and testing time, including number and duration of stations.
Scoring
3 Describe the rationale for the approach to scoring candidate performance:
a within station (eg domain/checklist/overall global judgement)
b how results are aggregated at the level of the overall assessment
c any marks or judgements given by the simulated or real patient, and how they contribute to the overall score.
Suggested evidence
i Example station materials including a marksheet or marksheets showing scoring (individual items and global descriptors), examples of weightings, rating scales and any anchor statements/other examiner guidance.
ii Any generic scoring guidance, eg generic anchor statements/descriptions of the borderline/just passing candidate.
iii Description of how overall CPSA scores are calculated and outcomes determined.
iv Example of the rating scale and scores used by the simulated or real patient, if applicable.
Standard setting
4 Describe how standards are set for the first-take and resit, as applicable, and the underlying rationale for the chosen method/s, including:
a standard setting method at station and overall assessment level
b any additional passing criteria (eg minimum number of stations passed)
Suggested evidence
i Detailed description of standard setting method/s and the application within and across stations (including approaches to compensation within the CPSA or across different assessment components).
ii Description and rationale for any additional standard setting criteria, eg use of one or more standard errors of measurement.
Assessing professionalism
5 Demonstrate how professionalism is assessed during the CPSA and unprofessional behaviours are captured and followed up.
Suggested evidence
i Description of how professionalism is assessed.
ii The process for logging and addressing concerns relating to unprofessional behaviours (eg cause for concern/yellow card) and its role in determining the outcome of the CPSA.
Content
Content sampling
The MLA content map is informed by Outcomes for graduates, the Foundation Programme training outcomes, the Generic professional capabilities framework and Good medical practice.
6 Show how the CPSA content relates to the MLA content map:
a Demonstrate that the CPSA maps to the three overarching themes:
i Readiness for safe practice
ii Managing uncertainty
iii Delivering person-centred care
b Demonstrate how the CPSA maps to the individual domains:
i Areas of clinical practice
ii Areas of professional knowledge
iii Clinical and professional capabilities
iv Practical skills and procedures
v Patient presentations
vi Conditions
c Demonstrate that candidates can identify and interpret clinical findings
Suggested evidence
i Evidence that the overall CPSA blueprint is mapped to the three overarching themes of the content map and that candidates demonstrate a level of competence across the content map domains. This could include a worked example of mapping the content of a single CPSA to the themes and domains in 6a and 6b.
ii A worked example of a single CPSA showing where and how candidates can demonstrate their ability to identify and interpret clinical findings.
CPSA construction
7 Demonstrate how a range of appropriate stakeholders is involved in the creation and development of stations to assure their authenticity and level of challenge.
Suggested evidence
i Description of the processes for ensuring that stations are set at the level for entering clinical practice in the UK and reflect what doctors might encounter in the workplace.
ii Description of processes for ensuring that stations are authentic from the patient's perspective.
Quality of CPSA content
8 Demonstrate how stations are created and quality is maintained. This should include:
a how station writers are trained
b the process for creating and approving new stations
c how feedback collected on the day of the CPSA and post-exam station metrics are fed into the writing and review process.
Suggested evidence
i Case study showing the lifecycle of a station.
ii Details of the training programme and materials for new station writers, including how these skills remain current.
iii Description of station review process, including examples of feedback and post-exam station metrics, and the revisions made to stations.
Security of CPSA content
9 Demonstrate how the security of the assessment content is maintained.
Suggested evidence
i Narrative explaining how security is achieved, including details of the process for station usage/review/revision/storage and sharing.
Preparation of and support for candidates
Familiarisation with the assessment process for candidates
10 Demonstrate how candidates have been given information about the CPSA well in advance, and briefed on the day, covering:
a assessment format, including scoring
b expected standards of performance
c how the CPSA will be run on the day.
Suggested evidence
i Evidence of timing and methods of communication, eg talks (slides and/or video recording), virtual learning environment (VLE) announcements, e-bulletins, handbooks, formative/mock CPSAs.
Results and feedback to candidates
11 Demonstrate what results and feedback are given to candidates and how the quality of any feedback is assured.
Suggested evidence
i Description of information provided to candidates, including results and feedback (eg examiners' free text comments).
ii Description of processes for assuring the quality of feedback to candidates.
Preparing candidates for a resit/repeat assessment
12 Demonstrate what support is given to unsuccessful candidates.

Suggested evidence
i Description of the remediation plan (eg feedback for unsuccessful candidates, availability of revision sessions).
Preparedness of examiners and patients for the CPSA
Examiners
We encourage the inclusion of multi-professional, lay and training grade examiners. Professionally qualified examiners should be in good standing with the relevant regulatory body.
13 Demonstrate how examiners are recruited and trained. This should include:
a criteria for becoming an examiner
b training to support examiners' preparedness
c details of marking calibration
d details of equality, diversity and inclusion (ED&I) training.
Suggested evidence
i Criteria for becoming an examiner.
ii Exemplar materials for training events, covering examiner conduct, awareness of bias, scoring guidance and training on giving feedback to candidates.
iii Details of marking calibration exercises to ensure that examiners have a common approach to identifying different levels of performance, especially borderline candidates.
iv Details of how examiner performance is monitored and feedback given.
Simulated/real patients
14 Describe how simulated/real patients are involved in the CPSA and demonstrate how they are recruited, trained, briefed and calibrated.
Suggested evidence
i Narrative detailing the involvement of simulated/real patients in the CPSA, and how they are trained and prepared for their role.
Familiarisation of examiners and simulated/real patients with station content
15 Demonstrate how the examiner and simulated/real patient for each station are given the opportunity to familiarise themselves with the station content.
Suggested evidence
i Details of briefing and station level familiarisation proximal to the CPSA.
ii Evidence of how the examiner and patient prepare on the day of the exam, eg by rehearsing the station together, or with examiners and patients on parallel circuits.
Feedback to examiners and simulated patients
16 Demonstrate what feedback is given to examiners and simulated patients, and how you monitor the effect of this feedback.
Suggested evidence
i Description of how examiner and simulated patient performance is monitored during the exam.
ii Description of feedback provided to examiners and simulated patients.
Policies and resources
Policies and procedures
17 Demonstrate that there are policies and procedures in place to deal with all aspects of the CPSA.
Suggested evidence
i Written policies and standard operating procedures for the CPSA (eg roles and responsibilities of key staff, mitigating circumstances, reasonable adjustments, illness on the day, appeals process and unexpected incidents around the time of the CPSA, or this information in a Code of Practice for Assessment).
ii Description of how the principles in Welcomed and Valued are applied when determining the necessary level of support for candidates, including the provision of reasonable adjustments for disabled candidates.
Resources and space
18 Show that the CPSA takes place in a space appropriate for a high stakes assessment with access to appropriate clinical resources.

Suggested evidence
i All resource details, eg map/photographs/video of circuit and inventory of resources, as well as external examiner comments/observations regarding suitability of assessment environment.
Data management
Data acquisition
19 Demonstrate the approach to accurate and consistent data acquisition during the CPSA, and dealing with missing data.
Suggested evidence
i A description of how scores are captured (eg on paper or tablet computer), and processes in place to ensure scores are accurate and complete (eg checks at the end of each session).
Production of results
20 Demonstrate how the assessment provider combines and checks results data to produce results for the exam board.
Suggested evidence
i Narrative describing the data processing that occurs between the completion of the CPSA and the exam board, including who is involved, what their responsibilities are, and what checks are in place to ensure accurate handling of data and calculation of results, including cross-checking.
Evaluation and quality assurance
Psychometric analysis
21 Demonstrate how assessment performance is analysed post-CPSA, including looking at factors such as the performance of candidates, examiners, stations, simulated patients and examination sites, and how the data feed back into a quality cycle.
Suggested evidence
i Description of the analyses that are carried out, including who is involved, what their responsibilities are, and what checks are in place to ensure accurate handling of data.
ii Example report of psychometric analysis.

Internal quality assurance
22 Describe the internal quality assurance processes for the CPSA, including how the processes feed into post-CPSA review, evaluation and decision making.
Suggested evidence
i Narrative describing the quality assurance for the CPSA, eg circuit walkthroughs prior to the CPSA, the role of internal examiners/leads on the day.
External examiners
23 Describe the role and input of the external examiner and how the assessment provider responds to the external examiner's advice.
Suggested evidence
i Evidence of how external examiners are recruited and briefed on their roles.
ii Records of external examiners' reports and the formal institutional response to them.

Annex B
March 2019 – CPSA example return
Evidence against the requirements and performance indicators for the MLA Clinical and Professional Skills Assessment (CPSA)
Background
The Medical Licensing Assessment
24 The Medical Licensing Assessment (MLA) is a demonstration that those who obtain registration with a licence to practise medicine in the UK meet a common threshold for safe practice.
25 The MLA consists of two parts:
e The Applied Knowledge Test (AKT): an online test of applied medical knowledge, to be taken by all candidates.
f The Clinical and Professional Skills Assessment (CPSA): the final, high stakes clinical examination in each medical school, and the MLA CPSA which we will run for IMGs.
Purpose of the CPSA
26 The primary objective of the CPSA is to demonstrate that an individual is capable of functioning safely on the first day of clinical practice in the UK.
CPSA requirements and performance indicators
27 The draft CPSA requirements and performance indicators outline advice to us from the MLA Expert Reference Group's CPSA subgroup about what requirements and performance indicators to consider for the CPSA. The purpose of these indicators is to ensure that an assessment provider is meeting the primary objective by assuring the quality, consistency and fairness of their CPSA, and that candidates can achieve the prescribed standard of proficiency. In identifying these indicators, we have ensured they are based on scholarship and good practice.
28 The CPSA is the final, high stakes performance assessment, irrespective of format (eg OSCE, OSLER, MOSLER, PACES), or sitting (eg main exam, resit/reassessment). All mentions of the CPSA in this document refer to each assessment provider's CPSA.

29 Assessment providers will be asked to submit evidence to us to show how they meet these requirements. This evidence will be reviewed to decide whether assessment providers meet the primary objective.
Purpose of this document
30 This document sets out an example of what a return against the draft CPSA requirements might look like during the formative engagement phase. It is not intended to be exhaustive or prescriptive, and is not the only way that you might put together your return.
31 The aim of this example is to give you an understanding of the level of detail for the evidence for each requirement and form a basis for the conversation at your formative meeting with us.
32 The example is a starting point for discussions and to provide the basis for a full and final return in the future. In the example, we have concentrated on the narrative against each requirement. At this stage, we're not looking for you to provide an in-depth evidence set, though you may find it helpful to list possible sources of evidence.

Evidence against the CPSA requirements
Before the CPSA
Eligibility for sitting the CPSA
33 Show that candidates have met the assessment provider's own eligibility criteria before entering the CPSA, including good professional standing and satisfactory progression.
Suggested evidence
iii Outline of the assessment strategy, including how knowledge, skills and behaviours are assessed prior to entry into the CPSA.
iv Evidence that individual student performance has been reviewed and decisions are made consistent with procedures (eg minutes from exam boards/progress panels showing that only students eligible to progress enter the CPSA).
Narrative
· In order to progress to a subsequent clinical year, students are required to pass annual summative examinations. They are also required to be in good academic standing (measured by passing the modules /blocks they have undertaken, plus supervisor reports, attendance monitoring etc.)
· Students who have demonstrated competence in Year 4 by passing the written and clinical OSCE of the Year 4 (Intermediate Professional Examination (IPE)), in the main sit or the re-sit, will progress into the final year following ratification by the Board of Examiners.
· Following completion of final year placements, students are eligible to sit the Year 5 Final Professional Examination (FPE) OSCE in January.
Evidence provided (document names and hyperlinks)
· 2018 IPE main sit Panel Meeting minutes (anonymised)
· 2018 IPE Board of Examiner meeting minutes (anonymised)
· Section from the Code of Practice of Assessment stating the eligibility criteria

Other sources of evidence
34 Demonstrate how the CPSA sits within an overall assessment strategy, eg workplace based assessments (WPBA) and clinical procedural skills.
Suggested evidence
v Outline of the assessment strategy.
vi Programme assessment map/programme blueprint.
Narrative
The aim of the Medical School is to provide an excellent standard of education and assessment which mirrors the specifications of the GMC's standards for medical education and training - Promoting excellence (2015).
The primary purpose of assessment of the core curriculum is to ensure that all students develop cumulative and integrated knowledge and skills so that they are competent to practise and have an appropriate foundation for lifelong learning.
Furthermore, the Medical School is required to demonstrate that students can practise as safe future doctors. It is for this reason that all students must demonstrate that they have achieved the minimum safe standard for their stage of the course.
Assessments are therefore designed to identify those students who are not ready to progress from one year of the course to the next as well as those students who are performing exceptionally well.
The key feature of assessment is that, in terms of content, assessments are cumulative. The style of examination is also intended to test the application of this progressive competence to clinical problems, to encourage breadth of learning, and to discourage as strongly as possible the adoption of selective, focussed learning strategies.
The MB ChB programme is not a modular programme. The programme is taught in an integrated manner and all summative assessments are integrated. From 2019, a uniform pattern of assessments with common principles for each year of the course has been developed.
Within every year of the MB ChB programme there is:
· A summative assessment
· This will normally consist of two components - a written assessment and a clinical/practical assessment; apart from Year 1 where the clinical examination is formative.
· Any student who is unsatisfactory in a summative examination will have the opportunity to take a re-sit examination. A student will be required to undertake a re-sit examination in those components that were failed in the first sit examination. For clarity, if the written assessment and clinical or practical assessment have both been failed at the first sit then both are taken at the resit. If only the written assessment has been failed at the first sit then only the written assessment is taken in the resit and if only the clinical or practical assessment has been failed at the first sit then only the clinical or practical assessment is taken at the resit.
· Prior to graduation, students who pass the CPSA will also have to demonstrate competence through completion of DOPS/skills portfolios, the Foundation apprenticeship placement and be in good professional standing.
Evidence provided (document names and hyperlinks)
· Section from the Code of Practice for Assessment detailing the resit policy
· Programme of Assessment diagram
· MBChB curriculum/programme overview
Familiarisation with the assessment process
35 Demonstrate how the candidates have been given information about the CPSA well in advance, including:
g assessment format
h scoring and standards
i how the CPSA will be run on the day.
Suggested evidence
i Evidence of timing and methods of communication, eg talks (slides and/or video record), virtual learning environment (VLE) announcements, e-bulletins, handbooks.

Narrative
· Students undertaking the Final Professional Examination OSCE (CPSA) attend two lectures delivered by the Head of Assessment and the Lead for Clinical Assessment, which explain the overview and format of the OSCE, including scoring and standards. These are recorded and uploaded onto the Medical School's virtual learning environment (VLE) platform so the students can review them at any point prior to the OSCE. The second session is mainly a Q&A to address any concerns or queries that the students may have.
· There is a section in the lectures which outlines what students can expect on the day and students are encouraged to contact the Assessment Office prior to the OSCE if they have any questions.
· Students are emailed 2 weeks prior to the OSCE with the date and time of their OSCE sitting, along with information about what to bring on the day.
· On the day of the exam the candidates are provided with a briefing video pre-recorded by the Lead for Clinical Assessment.
· Students with physical disabilities who request adjustments apply via the AEA committee and are informed what adjustments will be provided prior to the CPSA.
Evidence provided (document names and hyperlinks)
· 2019 FPE OSCE student lectures
· 2019 FPE OSCE student briefing
· Student OSCE information email
· Year 5 OSCE information handbook

CPSA construction and delivery
36 Demonstrate how a range of appropriate stakeholders is involved in the construction and/or delivery of the CPSA.*
Suggested evidence
i Documentation of composition of stakeholders involved in development and delivery of the CPSA (eg details of membership of CPSA blueprinting/working group, patients/carers/NHS clinicians involved in development of stations).
Narrative
Patient representatives:
Simulated Patients attend the OSCE editing sessions for any new stations. They also review the OSCEs prior to attending the simulator OSCE training and suggest edits to the stations to improve them. They have been the driver behind introducing diversity into the OSCE circuits over the past two years, for example same sex OSCE stations.
Real patients are used for examinations in phase 2 of the course. They are asked for feedback on the stations they are involved in although this is not currently formally documented.
Clinical expertise:
The station editing groups are made up of NHS Consultants, who supervise senior clinical placements, from a wide range of specialities across the course as well as the Clinical Skills facilitators.
The members of the editing groups are involved in writing, reviewing and editing the stations prior to the OSCE but also close the loop after the exam suggesting changes if a station could be improved.
OSCE stations with generic themes such as radiology, microbiology, haematology/biochemistry are reviewed by the relevant clinician to ensure accuracy of the content.
* Previously: "Demonstrate how a range of appropriate stakeholders is involved in the construction of the CPSA. This should include patient representatives, people with current experience of clinical practice at the relevant level in the NHS in the UK, and access to expert advice as needed."

Evidence provided (document names and hyperlinks)
· 2019 FPE OSCE editing groups (members, specialty and education role ­ block lead/clinical teacher)
· Evidence of development/evolution of an OSCE station (showing iterative versions, pilots and feedback following the station being used 'live')
37 Demonstrate that there are policies and procedures in place to deal with all aspects of the CPSA, including roles and responsibilities of key staff, mitigating circumstances, reasonable adjustments, illness on the day, appeals process and unexpected incidents around the time of the CPSA.
Suggested evidence
i Written policies and standard operating procedure (SOP) for the CPSA (eg roles and responsibilities of key staff, reasonable adjustments, illness on the day, appeals process and unexpected incidents around the time of the CPSA, or this information in a Code of Practice for Assessment).
Narrative
· We have a number of documents outlining the different roles on the day. All team members are trained so that anyone can undertake these roles, including:
o OSCE administration lead who oversees the running of the OSCE
o Station Monitor leads who oversee the timing of their individual OSCE station
o Clinical OSCE leads who oversee any clinical queries
o Clinical OSCE peer reviewer as a double marker (for quality control of marking)
· If a candidate feels that their OSCE performance may be compromised, they need to complete a mitigating circumstances form before the exam or within 7 days of the exam if the event happens during the examination time (Medical School Website).
· By presenting themselves for the exam candidates are confirming they are fit to be assessed. If a candidate then confirms they are unwell we provide details of what to do.
· If a candidate has a disability and may need Alternative Examination Arrangements (AEA) they complete a form found on the Medical School website. This is then considered by the AEA panel prior to the OSCE. Each case is assessed
on a case by case basis by a specialist in reasonable adjustments prior to the exam. Any adjustments to the exam arrangements can then be communicated through the Assessment Office.
· It is anticipated that the OSCE examinations will proceed without incident. However, if there are any perceived examination irregularities noted by students, patients, simulated patients or examiners, there are processes in place for them to be reported on the day of the examination to a senior member of staff. Students will be asked to sign a form at the end of the OSCE examination and before leaving the examination building to confirm that either irregularities have been reported, or that no irregularities occurred during their assessment. These irregularities are then discussed at the Year 5 Assessment Review Group.
· There is detailed information available regarding our appeals and misconduct process.
· Any incident during the OSCE that may disadvantage the student is investigated in real time by the OSCE administration lead and the OSCE clinical lead. This is to ensure transparency. Depending on the nature of the incident, the OSCE clinical lead may peer review in a station or the incident will be followed up after the OSCE is completed.
Evidence provided (document names and hyperlinks)
· SOP for Station Monitor instructions
· SOP for Clinical OSCE lead responsibilities
· SOP for OSCE peer reviewing
· 2019 example collated report of OSCE incidents from students and staff
· Case example from the AEA panel detailing the adjustments that were agreed for a student prior to the OSCE exam
· Section from the Code of Practice for Assessment detailing the Appeals Process document
Content sampling
The MLA content map will be informed by Outcomes for graduates, the Foundation Programme training outcomes, the Generic professional capabilities framework and Good medical practice.*
38 Show how the sampling strategy relates to the MLA content map.
Suggested evidence
i Current: sampling strategy for CPSA and CPSA blueprint.
ii Future: how current individual CPSA blueprints relate to the MLA content map.
Narrative
As the FPE OSCE exam runs on a yearly basis, we have a sampling strategy aimed at ensuring the exam is unpredictable for candidates while maintaining a balanced diet. The Year 5 OSCE working group have selected seven mandatory topics which they feel need to be addressed in each administration due to the risk they present to patients or known risks to Foundation doctors. Within the other 12 stations, domains are sampled fluidly to provide a robust test of knowledge and skills.
Current: We map to the current blueprint to ensure the correct number of mandatory topics and domains are used. The seven mandatory topics are:
1. Cancer
2. Primary Care
3. Seriously ill patient
4. Integrative Care
5. Child health
6. Reproductive
7. Ethical & Professional
The other 12 stations are selected to ensure that there are six stations primarily from each of the three domains:
· Observed History Taking
· Observed Clinical Examination
· Interpretation of Investigations
· Developing a Management Plan
· Procedural Skills
· Prescribing
· Problem solving
· Patient Safety
· Professionalism
· Communication skills
* The MLA content map will be published in Q3 2019.
Prior to deployment the assessment goes through a series of clerical checks to ensure that there is a balance of patients and scenarios.
Future: We will map to the new MLA content map once we have it.
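As an illustration of this sampling approach, the sketch below always includes the seven mandatory topics and then samples the remaining stations so that the blueprint domains are represented. The station pool, the one-per-domain coverage rule and the counts are illustrative assumptions rather than the working group's actual balancing procedure.

```python
# Minimal sketch of blueprint sampling: mandatory topics always included,
# remaining stations sampled fluidly while covering the listed domains.
# The pool contents and selection rule are illustrative assumptions.
import random

MANDATORY_TOPICS = ["Cancer", "Primary Care", "Seriously ill patient",
                    "Integrative Care", "Child health", "Reproductive",
                    "Ethical & Professional"]

def sample_circuit(station_pool, domains, n_other_stations=12, seed=None):
    """station_pool: (station_id, domain) pairs for the non-mandatory stations.
    One station is taken per domain to guarantee coverage, then the remaining
    slots are filled at random. Assumes the pool is large enough."""
    rng = random.Random(seed)
    chosen = []
    for domain in domains:
        options = [s for s, d in station_pool if d == domain and s not in chosen]
        chosen.append(rng.choice(options))
    remaining = [s for s, _ in station_pool if s not in chosen]
    chosen += rng.sample(remaining, n_other_stations - len(chosen))
    return MANDATORY_TOPICS + chosen
```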
Evidence provided (document names and hyperlinks)
Past 3 years of the FPE OSCE main-sit and re-sit blueprint (2017, 2018 and 2019)
39 Demonstrate how the quality, security and currency of the assessment content is maintained.
Suggested evidence i Narrative explaining how quality, security and currency are achieved, including details of station development, who is involved, training events, the process for station usage/review/revision/storage and sharing.
Narrative
Quality:
· The Station Writers Group oversees the creation and editing of stations. Each member meets with the Lead for Clinical Assessment every year to discuss potential station topics and design based on the learning outcomes.
· New OSCE stations have a mock run through with a simulator, an FY2 and an examiner to test for accuracy, and these pilots are conducted as part of the wider OSCE editing group meetings.
· Feedback from exam days, including from the station writer, is used to edit stations. Best practice is to do this on the day of the exam when the OSCE station is live; however, this process is completed within 2 weeks of the exam.
· Stations are reviewed on an ongoing basis and at each stage in the development process. Each station states when it was last used, indicating when the last editing occurred.
Security:
· We have dedicated space that is available for OSCEs at our partner teaching
hospital which meets all the modern information security requirements.
· We do not share any OSCE station material via email to anyone outside The Assessment Team (station writers are included here). All electronic OSCE station documents are password protected.
· Candidates presenting on the day of the OSCE in the morning and afternoon hand in all electronic devices prior to starting the circuit. The afternoon candidates sit in a lecture theatre until the morning candidates are released so no form of communication can take place.
· Students are quarantined so that there can be no possible transfer of information between those sitting the exams in the morning and the afternoon. This is accepted by students as part of the requirement of the CPSA.
· OSCE stations are changed for each day of the exam.
Training:
· Any Consultants involved in supervising senior clinical placements are encouraged to write stations. New writers meet with the Lead for Clinical Assessment and are trained in how to write an OSCE station, with examples provided at the face to face meetings.
Involvement:
· All stakeholders are involved at some stage.
Patient representatives:
Role players attend the OSCE editing sessions for any new stations. They also review the OSCEs prior to attending the simulator OSCE training and suggest edits to the stations to improve them. They have been the driver behind introducing diversity into the OSCE circuits over the past two years, for example same sex OSCE stations.
Clinical expertise:
The station editing groups are made up of NHS Consultants from a wide range of specialities across the course as well as the Clinical Skills facilitators.
The members of the editing groups are involved in writing, reviewing and editing the stations prior to the OSCE but also close the loop after the exam suggesting changes if a station could be improved.
OSCE stations with generic themes such as radiology, microbiology, haematology/biochemistry are reviewed by the relevant clinician to ensure accuracy of the content.
Usage:
· All OSCE stations are logged on the X drive in the FPE OSCE station bank. This includes when they have been previously used or whether they are new, who the author is and when they were written. They are not logged into the X drive until they have been through an OSCE editing Group.
Sharing:
· We have shared a few stations with the Medical School Assessment Alliance.
Monitoring:
· An OSCE station is not used two years in a row, and is reviewed in the OSCE editing group if it is to be included in the next OSCE exam.
Evidence provided (document names and hyperlinks)
Screenshot of the X drive with FPE OSCE bank
CPSA design
40 Demonstrate the rationale for the assessment approach used for the CPSA. This should include:
j format (OSCE, OSLER, MOSLER, PACES etc.)
k station design
l testing time, including number and duration of stations
m approach to scoring candidate performance.
Suggested evidence i Description and explanation of the philosophy underpinning the construction, design/scoring and delivery of the CPSA with respect to the overall assessment strategy.
Narrative
The assessment exists to allow candidates to demonstrate that they have the knowledge and skills to enter the UK Foundation Programme. The OSCE is designed around scenarios that a doctor may experience while working in the Foundation Programme. For example, the Professional stations address important and difficult conversations and ethical issues they may encounter as an FY1. Scenarios may be drawn from a range of clinical settings as per the blueprint. The stations are sampled fluidly to reduce predictability; however, certain areas of risk are always sampled, e.g. clinical reasoning, clinical skills and prescribing. Station design is based on an integration of skills in scenarios that are designed to be as realistic as we can simulate. We are looking broadly at an individual's ability to perform at the level of an FY1, so some focus is paid to communication and ethics.
Format: OSCE
Station design: We use a combination of real patients and simulated patients (actors). There is one examiner in each room.
Test times: Nine stations, with different testing times (we are physically unable to add more stations). The complex stations, testing various domains, are longer and range from 20 to 25 minutes. The stations with less complexity are 10 minutes. There is one minute of reading time per station, with a total testing time of 150 minutes. We are presently re-designing the Year 3, 4 and 5 OSCEs, where the stations will be a combination of 10 or 20 minutes.
Scoring: Checklist and global scores by examiners
Evidence provided (document names and hyperlinks)
2019 FPE OSCE circuit slide from the briefing
2019 FPE marksheet for station
2019 CoP Assessment explaining the pass mark
41 Demonstrate how station writers are trained (eg through station writing workshops run by members of the faculty who are experienced station writers).
Suggested evidence i Details of the training programme for new station writers (eg who leads this and what happens on it), and how these skills are refreshed.
Narrative
· Any Consultants supervising senior clinical placements are encouraged to write stations. New writers meet with the Lead for Clinical Assessment and are trained in how to write an OSCE station, with examples provided at the face to face meetings.
· The work of new writers is reviewed and individual feedback is provided initially by the Lead for Clinical Assessment and then in the subsequent editing groups.
· Ongoing feedback is also provided to all associates about their stations.
· We have a style guide for station writers to use and follow.
· There is also a blank template for station writers to use when writing new stations.
Evidence provided (document names and hyperlinks)
Blank OSCE template
Style guide for station writers
42 Demonstrate how feedback from examiners, exam board and station level metrics feed back into the station writing process (see also 53).
Suggested evidence i Explanation of process, including examples of metrics and feedback, and the revisions made to stations. ii Case study showing the lifecycle of a station.
Narrative
We pilot all stations before first use. We then gather feedback from candidates, role players and examiners. The stations are then amended prior to being included. Our psychometric lead produces station level metrics in advance of the Assessment Review Group Meeting and the subsequent Panel. They review item performance based upon these and select items for discussion with the panel based on the metrics. Any OSCE station that flags is reviewed in more detail and, if necessary, the OSCE station is removed from the OSCE bank until suitable revisions can be made. Feedback from examiners and role players is considered on the day of the exam, and the Chief Examiner decides upon any necessary edits, which are implemented concurrently.
Evidence provided (document names and hyperlinks)
An example of a life cycle of a station:
· Station 3
· Date written: June 2018
· Author: JA
· Edited: September 2018
· First Used: FPE OSCE January 2019
· Rested on: FPE OSCE 2020
· Back in bank: FPE OSCE 2021
Feedback form from an examiner and simulated patient regarding an OSCE station
Spreadsheet listing when and who wrote all the FPE OSCE stations
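As an illustration of the kind of station-level flagging described in the narrative above, a check along the following lines could be run on each station's results; the metrics (checklist/global correlation, pass rate) and thresholds are assumptions, since the submission does not specify which statistics the psychometric lead uses.

```python
# Minimal sketch of station-level flagging ahead of the Assessment Review Group.
# Metric choices and thresholds are illustrative assumptions only.
import numpy as np

def flag_station(checklist_scores, global_ratings, cut_score=None,
                 min_score_global_r=0.2, pass_rate_bounds=(0.30, 0.98)):
    """Return a list of reasons to discuss this station with the panel."""
    flags = []
    r = np.corrcoef(checklist_scores, global_ratings)[0, 1]
    if r < min_score_global_r:
        flags.append(f"low correlation between checklist scores and global ratings ({r:.2f})")
    if cut_score is not None:
        pass_rate = float(np.mean(np.asarray(checklist_scores) >= cut_score))
        if not pass_rate_bounds[0] <= pass_rate <= pass_rate_bounds[1]:
            flags.append(f"unusual pass rate ({pass_rate:.0%})")
    return flags
```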
Standard setting
43 Demonstrate how standards are set and the underlying rationale for the chosen method, including:
n standard setting method at both station and overall assessment level
o any additional passing criteria (eg minimum number of stations passed). *
Suggested evidence i Detailed description of standard setting method, both within and across stations (including approaches to compensation within the CPSA or more widely), and the process and timeline by which this is applied, including who is involved and what training they undertake to support standard setting activity.
ii Rationale for the method and explanation of any modifications.
iii Description and rationale for any additional standard setting criteria, eg use of SEM.
Narrative
The Borderline Group Regression (BGR) method has been used to set the primary standard for the FPE OSCE (CPSA) since 2014. The BGR method uses a global rating of a student's performance provided by the station examiner(s); the student is rated by an examiner at each station. The data from which the primary standard for the examination is derived are therefore collected during the examination. The student's performance is rated by the examiner on a five-point global rating scale.
The primary standard is therefore based both on the cut score determined by BGR and on the expert judgements of the examiners. All individuals who make these performance judgements will therefore receive on-going training in order to ensure reproducibility of standards and maximise examiner homogeneity.
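As an illustration only, the calculation can be sketched as follows: station checklist scores are regressed on the examiners' global ratings, and the score predicted at the borderline grade becomes the station cut score. The data, the scale anchor used for "borderline" (2 on a 1 to 5 scale) and the score ranges are illustrative assumptions, not the school's actual figures.

```python
# Minimal sketch of a Borderline Group Regression (BGR) cut-score calculation.
# The scores, ratings and the "borderline" grade value are illustrative assumptions.
import numpy as np

def bgr_station_cut_score(checklist_scores, global_ratings, borderline_grade=2.0):
    """Regress checklist scores on global ratings and return the checklist
    score predicted at the borderline grade (the station cut score)."""
    slope, intercept = np.polyfit(global_ratings, checklist_scores, deg=1)
    return slope * borderline_grade + intercept

# Example: one station, ten candidates (hypothetical data)
ratings = np.array([1, 2, 2, 3, 3, 3, 4, 4, 5, 5])
scores = np.array([8, 11, 12, 14, 15, 16, 18, 19, 21, 22])
print(f"Station cut score: {bgr_station_cut_score(scores, ratings):.1f}")

# The primary (overall) standard described below sums the station cut scores
# across the circuit before any conjunctive criteria are applied.
```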
Once the standard setting using the BGR method has been completed, a decision on whether a student's performance is satisfactory or unsatisfactory will be based on the following two criteria:
· Overall pass score: A student must achieve the overall pass mark for the examination (this ensures a sufficiently high standard), based on total station pass scores.
· Conjunctive standard: A student must clearly pass a minimum number of stations, based on an examiner's global grading (this ensures breadth of competence and limits compensation).
These conjunctive criteria are agreed by the Board of Examiners with the advice of the Assessment Group. A student must meet both criteria to be graded as Satisfactory for the examination. Performance in the FPE also counts towards the award of MBChB (Honours). Our standard setting is applied using statistical packages, working from a Microsoft spreadsheet. At the Assessment Review Group Meeting the pass mark is discussed; this is the summed cut score from the borderline regression of all the stations, combined with the conjunctive criterion based on the demerit scores.
* Previously: "Demonstrate how standards are set and why, including: a standard setting method at both station and overall assessment level; b any additional passing criteria (eg minimum number of stations passed)."
Evidence provided (document names and hyperlinks)
Section from the Code of Practice of Assessment
Psychometric report for the 2019 FPE OSCE
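For illustration, the two-part decision described above can be expressed as a simple check. The pass mark, station count and minimum-stations threshold used here are hypothetical values, not the school's actual figures.

```python
# Minimal sketch of the conjunctive pass decision: a candidate must meet BOTH
# the overall pass mark and the minimum number of clearly passed stations.
# All threshold values below are illustrative assumptions.
def is_satisfactory(total_score, stations_clearly_passed,
                    overall_pass_mark, min_stations_required):
    return (total_score >= overall_pass_mark
            and stations_clearly_passed >= min_stations_required)

# Example with hypothetical figures: pass mark of 130, at least 6 of 9
# stations clearly passed on the examiners' global grading
print(is_satisfactory(total_score=138, stations_clearly_passed=7,
                      overall_pass_mark=130, min_stations_required=6))  # True
```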
Examiners
We encourage the inclusion of multi-professional, lay and training grade examiners. Professionally qualified examiners should be in good standing with the relevant regulatory body.
44 Demonstrate how examiners are recruited and trained. This should include:
p equality and diversity (E&D) training
q training events before the day of the CPSA covering examiner conduct, awareness of bias and scoring guidance
r calibration exercises to ensure that examiners have a common approach to identifying different levels of performance, especially borderline candidates.
Suggested evidence i Details, timings and participation requirements for all events, including any lay examiners.
ii Exemplar materials, including examples of good and poor conduct, bias and scoring guidance to support consistency in making global judgements.
iii Details of how examiner performance is monitored and feedback given, covering appropriate examiner professional behaviours and marking behaviours (eg examiners who may be consistently `hawks' or `doves').
Narrative
· Any consultant or GP who engages with the clinical component of the MBChB course can examine at the FPE OSCE (CPSA); this includes recently retired doctors who regularly examined (retired doctors can examine up to five years after their last revalidation date). All examiners since 2014 have attended a face to face examiner teaching session. An online examiner refresher course is undertaken every 4 years; however, examiners have the option to attend another face to face session if they prefer.
· ST6 trainees and above can examine as well
· The face to face examiner training is a 3 hour interactive session covering examiner behaviour, unconscious bias, the principles of assessment and videos of mock OSCE stations where the examiner can mark and award a global score. This is done through live voting software (Turning Point), followed by a discussion.
· Following the completion of training examiners are allocated an examiner number and placed on the examiner database
· Prior to attending the OSCE they have to provide evidence that their Equality and Diversity training is up to date
· The examiner briefing delivered on the day of the FPE OSCE (CPSA) gives a reminder of the purpose of the exam and the examiner's role on the day. It also covers the marking system used including a reminder of the global scores.
· Examiners are monitored by a Senior Member of the Medical School who peer reviews the marking of OSCE stations on an iPad.
· Any examiner who is flagged due to their behaviour is discussed at the Assessment Review Group and subsequent panel, and an action plan is agreed.
Evidence provided (document names and hyperlinks)
· Examiner training PowerPoint slides
· Examiner training word document
· Screenshot showing when an individual examiner was trained and when they need to refresh
· 2019 Medicine and Surgical online refresher training
· Google form showing 12 Examiner Bloopers to focus examiner behaviour
· 2019 FPE OSCE Examiner briefing
Simulated/real patients
45 Demonstrate that candidates can identify and interpret clinical findings.
Suggested evidence i Evidence that within the overall CPSA blueprint these clinical skills are assessed and that candidates demonstrate a level of competence across all domains of the content map.*
Narrative
In the 2019 FPE OSCE, Stations 1 and 2 have real patients with clinical signs.
Station 1 is a Chronic Medical Station and includes patients with a cardiovascular, respiratory, gastrointestinal, neurological, endocrine, diabetes or rheumatology condition.
Station 2 is a Cancer Care Station and includes patients with cancers including breast, colorectal, head and neck, hepatobiliary and pancreatic, lung, lymphatic, skin, myeloma, upper gastrointestinal and urological. We also recruit patients with cancer related conditions and signs including malignant spinal cord compression, superior vena cava obstruction and splenomegaly.
Clinical skills are assessed as part of the Acute Care station and the Obs and Gynae stations. We utilise an ALS mannequin when assessing ILS OSCE stations. We utilise low fidelity simulations such as photographs and blood results in the GP, Integrative Care, Obs and Gynae and Child Health OSCE stations.
We regularly use the following manikins:
· Pregnancy
· Vaginal
· Urinary
Each of these can be utilised to simulate real clinical findings as a part of the station.
* Previously: "Evidence that within the overall CPSA blueprint these clinical skills are assessed and that candidates demonstrate a level of competence (not cross-compensated by other skills such as communication)."
Evidence provided (document names and hyperlinks)
· Marksheet for 2019 real patient station
· Marksheet for acute care station (clinical findings section)
· Marksheet for O&G requiring interpretation of clinical findings (e.g. scan result)
46 Describe the role of simulated/real patients in the CPSA (eg scoring) and demonstrate how they are recruited, trained, calibrated and debriefed, including E&D training as appropriate.
Suggested evidence i Narrative detailing the involvement of simulated/real patients in the CPSA, how they are prepared for this, and if they provide any scores/feedback, how this contributes to assessment outcomes.*
Narrative
The simulated patients/role players take on the role of a patient, by playing to a written scenario and responding to the candidates' questions and information. The scenario contains background details of the character, an opening statement, information to give to the candidates (either freely or if specifically asked), questions to ask, notes on behaviour/demeanour, and details of any examination required in the station.
The role players are recruited through the Simulation Education manager and trained a week before the FPE OSCE by a range of experienced clinicians in their field.
All role players attend a session on the day to explain how the OSCE circuit works and they are trained in marking, awarding global scores and giving feedback.
The role players mark independently in 4 stations in the FPE OSCE, including GP, Integrative Care, Professionalism Communication and Professionalism Ethics.
The role players calibrate their roles with their examiner prior to the exam itself and agree the level of emotion and how to play the station.
One role player facilitator is present for each circuit. They run through a morning briefing with the role players, ensure calibration/standardisation is carried out, and deal with any role player issues throughout the day.
Role players are monitored (by the examiners and station monitors) in terms of time management, professionalism and attitude, as well as accurate playing of the scenarios, and any issues are dealt with swiftly, which could mean a conversation with a role player on the day of the OSCE.
* Previously: "Narrative detailing the involvement of simulated/real patients in the CPSA, how they are prepared for this, and how any scores/feedback they provide contribute to assessment outcomes."
Role players complete a Medical School feedback form on which they can provide their own feedback on any issues, such as the scenario, the examiner or a candidate (eg if there was a problem or some script suggestions on the day).
Real patients do not mark students directly. Their views are sought by the examiner but the actual marking is decided by the examiner.
Examiners have to provide evidence of their E&D training when they volunteer to examine. A record is kept on their examiner profile at the Medical School.
Evidence provided (document names and hyperlinks)
2019 FPE OSCE simulator training session
External examiners
47 Show that there is a structured pro forma to guide external examiners to provide feedback (see also 55).
Suggested evidence i Example of completed external examiner report.
Narrative
An external examiner attended the 2019 FPE OSCE
Evidence provided (document names and hyperlinks)
The documents the external examiner received prior to the FPE OSCE
On appointment each external examiner will be sent:
· the link to the guidance provided by the Quality Office External Examiners, which includes the External Examining Handbook and online training
· a copy of the most recent curriculum documents, which include information about the philosophy, educational principles, structure and detailed aims and learning outcomes of the curriculum
· a written description of the role of external examiners within the MB ChB course
· a copy of the code of practice for assessment of students
· in addition, a briefing meeting will be held in advance of the main summative examinations
A copy of the 2019 Leicester Medical School handbook for external examiners (updated annually)
An example of an external examiner report from the FPE OSCE (with external examiner identity removed)
During the CPSA
Resources and space
48 Show that the CPSA takes place in a suitable, secure space with access to appropriate resources.
Suggested evidence i All resource details. Could include map/photographs/video of circuit and inventory of resources, as well as external examiner comments/observations regarding suitability of assessment environment.
Narrative
We have dedicated space at our partner teaching hospital that is available for the delivery of the FPE OSCE (CPSA). There is an upstairs component and a downstairs component. The space is adaptable.
Downstairs
There is a registration area for examiners. There are 10 single rooms, with 2 further areas which can accommodate 12 candidates altogether. There is a room with a projector screen and chairs for the lunchtime examiner briefing, and a separate room for patients to relax in, which is situated beside the toilet. The simulated patient room is also situated downstairs, where they can relax independently from the patients and the examiners.
Upstairs
There are separate areas upstairs that can accommodate between 3 and 6 students in each OSCE station at a time.
Both circuits are manned by Station monitors and an OSCE lead to ensure that candidates move on to the correct station.
There is a technician room off the circuit for storage of equipment. There is additional storage off the circuit.
Evidence provided (document names and hyperlinks)
Floor map of the circuit
Photographs of the spaces set up for an exam
Familiarisation
49 Demonstrate that the examiners and simulated patients are appropriately briefed around the time of the CPSA and have familiarised themselves with the station content relevant to their role, including rehearsing the station together.
Suggested evidence i Evidence that briefing and station level familiarisation proximal to the exam are undertaken, eg evidence from the external examiner's report.
Narrative
Examiners and role players attend a briefing on each day of the FPE OSCE exam they attend. Role players are provided with details of their role for the day 1 week in advance and are trained.
Once each briefing has taken place, the examiners split into their stations, where they discuss the station as a group and, if there is any uncertainty, come to an agreement after discussing with the OSCE floor lead. They then discuss the station individually with the simulator and answer any further questions. Examiners who are examining in a station with a real patient review the opening statement, the recent outpatient letter and the pro-forma. They then go through the history with the patient and examine them to ensure that any clinical signs are present and can be demonstrated. The examiner and role player rehearse their station together.
The examiner and role player/real patient have 30 minutes to calibrate the station prior to the OSCE circuit commencing.
Evidence provided (document names and hyperlinks)
Evidence from the 2019 external examiners report
2019 CPSA OSCE Examiner briefing (PowerPoint)
50 Demonstrate that the candidates are appropriately briefed on the day of the CPSA.
Suggested evidence i Evidence that briefing proximal to the exam is undertaken, eg evidence from the external examiner's report.
Narrative
Candidates receive a standardised briefing, delivered as a pre-recorded PowerPoint narrated by the Lead for Clinical Assessment.
Evidence provided (document names and hyperlinks)
2019 FPE OSCE student briefing on the day
2019 External examiners report
Accurate data acquisition
51 Demonstrate the approach to accurate and consistent data acquisition, and dealing with missing data.
Suggested evidence i A description of how scores are captured (eg checklist that is scanned, or tablet input), and of processes in place to ensure scores are accurate and complete (eg checks at the end of each session).
Narrative
Scores are captured on paper manually (not OMR). Each station monitor lead checks for missing marks as each candidate completes each station. Examiners with missing marks are identified immediately after the student leaves the station and asked to fill in the missing mark/s as soon as possible, while it is fresh in their memory.
If a missing mark is identified at a later stage, the Medical School's position is that, if it has not been filled out, a score should be awarded as if the candidate did perform it (students are given the benefit of the doubt).
Once papers are counted in at the OSCE venue, the papers are transported to the Medical School for collation. A team of administrative staff input the data into a spreadsheet and each person's work is then checked by another member of staff for accuracy. The process takes three days. We are planning to introduce an iPad system within 2 years.
Evidence provided (document names and hyperlinks)
Document explaining how marks are checked in the FPE OSCE
Evidence of how many missing marks needed to be awarded across an FPE exam diet
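As an illustration of the checks described above, a mark-sheet record could be scanned for missing items along the following lines; the data structure, field names and maximum item score are assumptions for the sketch, not the school's actual paperwork or spreadsheet layout.

```python
# Minimal sketch of the missing-mark check and the benefit-of-the-doubt rule:
# missing items are listed per examiner/station so they can be chased on the
# day, and any mark still missing at collation is awarded in the candidate's
# favour. Field names and score ranges are illustrative assumptions.

def find_missing_marks(marksheets):
    """marksheets: list of dicts like
    {"candidate": "C001", "station": 3, "examiner": "E12",
     "items": {"item_1": 2, "item_2": None}, "global": 4}
    Returns (candidate, station, examiner, item) tuples still unmarked."""
    missing = []
    for sheet in marksheets:
        for item, score in sheet["items"].items():
            if score is None:
                missing.append((sheet["candidate"], sheet["station"],
                                sheet["examiner"], item))
        if sheet["global"] is None:
            missing.append((sheet["candidate"], sheet["station"],
                            sheet["examiner"], "global"))
    return missing

def resolve_benefit_of_doubt(sheet, max_item_score=2):
    """At collation, any item still missing is scored as if the candidate
    performed it (the benefit-of-the-doubt position described above)."""
    for item, score in sheet["items"].items():
        if score is None:
            sheet["items"][item] = max_item_score
    return sheet
```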
After the CPSA
Exam board
52 Demonstrate how the assessment provider ensures the data presented to the exam board are correct (eg cross checking, managing missing data).
Suggested evidence i Narrative describing the process between the completion of the CPSA, and the exam board, including who is involved, what their responsibilities are, and what checks are in place to ensure accurate handling of data and decision-making.
Narrative
Following the FPE OSCE, marks are cross checked manually, and missing marks and global scores are awarded before the data are entered into a spreadsheet by the OSCE administration team. Once complete, the spreadsheet is sent to the psychometric lead for analysis. Results are discussed at the Assessment Review Group to identify whether any OSCE station has not worked and therefore may need to be removed from the OSCE circuit. If this is the case, the cut score is recalculated and the de-merit level is re-set, before presenting the final data set at the Panel.
Evidence provided (document names and hyperlinks)
Document explaining how the data presented to the exam board is correct - relevant section in the Code of Assessment.
53 Demonstrate how assessment performance is analysed post-exam by a group with appropriate expertise, including looking at factors such as the performance of examiners, stations, simulated patients and examination sites, and how the data feed back into a quality cycle.
Suggested evidence i An account of the analyses that are carried out and how they are used to influence further station/CPSA development.
Narrative
Presently we can look broadly at how an OSCE station has performed over the 3 days and, if it flags, we look in more detail to ascertain whether the issue arose on a particular day. This is then reviewed by senior members of the Assessment Team and a decision is taken whether to remove the station from the bank completely, or to remove it from the bank until it is revised and goes through another editing group. We do not have the capability or the resource presently to look at the performance of individual examiners. We have not analysed simulator performance but are planning to.
Evidence provided (document names and hyperlinks)
Individual OSCE station psychometrics (post hoc analysis)
Minutes from the 2019 FPE Panel detailing this discussion
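As an illustration of the day-level review described above, a flagged station's results could be broken down by exam day as sketched below; the record format is an assumption, not the actual psychometric output.

```python
# Minimal sketch of breaking a flagged station's results down by exam day,
# to see whether a problem is confined to a particular day.
# Field names are illustrative assumptions.
from collections import defaultdict
from statistics import mean

def station_means_by_day(results, station_id):
    """results: iterable of dicts like
    {"station": 3, "day": "Day 1", "total_score": 17.5}
    Returns the mean station score for each exam day."""
    by_day = defaultdict(list)
    for r in results:
        if r["station"] == station_id:
            by_day[r["day"]].append(r["total_score"])
    return {day: mean(scores) for day, scores in by_day.items()}
```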
54 Demonstrate how unprofessional candidate, examiner and simulated patient behaviour during the CPSA is captured and dealt with (eg cause for concern/yellow card).
Suggested evidence i Show a clear policy for logging and addressing concerns relating to unprofessional behaviours. Provide evidence of how these data are used in determining outcomes of assessments.
Narrative
Within each OSCE mark sheet there is a section for the examiner to document any cause for concern. All the concerns are then documented in a spreadsheet and reviewed by Senior Members of the Assessment Team. The justified concerns are then fed back to the students.
After each OSCE circuit the students are de-briefed, where they have the opportunity to report any unprofessional examiner or simulator behaviour. The examiners can also raise concerns during the OSCE regarding simulator behaviour, which are addressed in real time.
Peer review and external examiner review occur during the examination to directly observe the behaviour of examiners and simulators. A log is kept of which examiners have been reviewed during each exam diet. Any professionalism issues raised with examiners and simulators are discussed and actioned at the Assessment Group meeting.
Evidence provided (document names and hyperlinks)
· 2019 FPE OSCE causes for concern spreadsheet (anonymised)
· 2019 Peer review summary
· 2019 External examiner report
· Case study of unprofessional behaviour from an examiner
External examiners
55 Describe the role of the external examiner at the exam board and how assessment providers respond to the external examiner's report.
Suggested evidence i Provide records of external examiners' reports and the formal institutional response to them. ii Where outcomes/actions are identified, demonstrate how they are addressed.
Narrative
External examiners are invited to be present at all examiners' meetings at which significant decisions are to be taken, including the Panel of Examiners. An external examiner should be present, or available for telephone consultation, at Board of Examiner meetings where award decisions are made.
Evidence provided (document names and hyperlinks)
2019 FPE OSCE external examiner report and the University response to it
Results and feedback to candidates
56 Demonstrate what individual results (pass/fail, grades) and feedback (eg narratives) are given to candidates and why.
Suggested evidence i Description of information provided to candidates, including results and feedback. ii Rationale for why these methods of feedback have been chosen.
Narrative
One document is released to the students outlining the breakdown of grades awarded per station and their overall result (unsatisfactory, satisfactory, merit or distinction). All written feedback from all the OSCE examiners is scanned and released to the student as a PDF document at the same time.
Evidence provided (document names and hyperlinks)
PDF for a student in the 2019 FPE OSCE (anonymised)
Feedback to examiners and simulated patients
57 Demonstrate what feedback is given to examiners and simulated patients and why.
Suggested evidence
i Description of feedback provided to examiners and simulated patients.
ii Rationale for why these methods of feedback have been chosen.
iii Show how these data are used to improve examiner/SP performance.
Narrative
We presently do not provide feedback to the examiners and simulated patients. However, we are planning to implement this in the next 3 years. We are currently exploring how technology can be utilised.
Evidence provided (document names and hyperlinks)
None at this stage.
Email: gmc@gmc-uk.org Website: www.gmc-uk.org Telephone: 0161 923 6602
General Medical Council, Regent's Place, 350 Euston Road, London NW1 3JN.
Textphone: please dial the prefix 18001 then 0161 923 6602 to use the Text Relay service

Join the conversation

@gmcuk

facebook.com/gmcuk

linkd.in/gmcuk

youtube.com/gmcuktv

To ask for this publication in another format or language, please call us on 0161 923 6602 or email us at publications@gmc-uk.org.
Published September 2019 © 2019 General Medical Council
The text of this document may be reproduced free of charge in any format or medium providing it is reproduced accurately and not in a misleading context. The material must be acknowledged as GMC copyright and the document title specified.
The GMC is a charity registered in England and Wales (1089278) and Scotland (SC037750).
Code: GMC/MLACPS/0919

