Abstract
Assessment of competencies in rheumatology is difficult, but possible, and is an important part of the evaluation of practising clinicians, helping to prevent poor performance. Competencies are currently assessed by the Royal College of Physicians, the General Medical Council, and the National Clinical Assessment Authority.
- BOF, best of five
- DOPS, direct observation of procedural skills
- EMQs, extended matching questions
- GMC, General Medical Council
- MCQs, multiple choice questions
- NCAA, National Clinical Assessment Authority
- OSCE, objective structured clinical examination
- PACES, practical assessment of clinical examination skills
- PLAB, Professional and Linguistic Assessments Board
Keywords: competencies, assessment
The term competence implies what a doctor should be able to do (knowledge and skills), and performance means what a doctor does in real clinical practice (knowledge, skills, and attitudes, including behaviour). In this review methods of assessment of clinical competence will be outlined, some innovations currently in use in the UK will be discussed, and the assessment of clinical performance in the UK will be reviewed. The theme is that the assessment of competencies in rheumatology is difficult, but possible.
The need to assess competencies in UK rheumatologists has come from a requirement to be open and to demonstrate that we have high standards of practice. There has been an increase in public demand for good quality doctors, driven by problems at Bristol1 and Alder Hey2 and by doctors who have come to the attention of the General Medical Council (GMC) in a public manner, such as Harold Shipman and Rodney Ledward. In addition, the Calman Specialist Registrar programmes have resulted in less time for trainee rheumatologists to develop their skills.3
One of the major problems with registrar training has been the implementation of the European working time directive.4 This has meant a reduction in trainees’ working hours in rheumatology.5 Trainees gain less experience on call and in the hospital and have less continuity of care because of shift working, resulting in less time for teaching and learning. Can competence be acquired and maintained under these circumstances?
The assessment of competence can be divided into assessment of knowledge, assessment of problem solving, assessment of skills, and assessment of attitudes. This was described by Miller, and is referred to as the Miller triangle,6 with the verbal descriptions of “knows → knows how → shows how → does” representing the developmental stages of acquisition of knowledge → problem solving → skills → attitudes and behaviour.
To understand assessment it is necessary to have some definitions of assessment terminology. The validity of an assessment is whether it measures what it is supposed to measure. The reliability of an assessment is whether it is a consistent measure and whether the sample of activity being assessed is large enough. The practicability of an assessment is whether it can be carried out with the available resources.
Table 1 shows some methods of assessment.
TESTS OF KNOWLEDGE
The commonest test of knowledge that is used is the multiple true-false multiple choice question (MCQ). These tests allow a large amount of knowledge to be assessed and are reliable7 but not valid as they do not relate to what our candidates do in practice. However, they are relatively easy to administer and require few resources.
“MCQs are not valid because they do not relate to practice”
Extended matching questions (EMQs) and best of five (BOF) questions are variants of the multiple true-false MCQ that allow problem solving to be assessed in addition to knowledge. In EMQs, a list of possible answers is provided and candidates match the correct answer to the clinical scenarios provided. In BOF style questions, only one of the five options is correct. These question styles reduce the cueing effect of standard MCQs. BOF questions are now the preferred format for MRCP part 1 and 2 written examinations.
TESTS OF SKILL
Tests of skill should reflect real practice. The traditional unobserved long case examination is highly unreliable and provides an inaccurate picture of candidate strengths, weaknesses and overall competence, especially if only one case is being assessed. Reliability can be increased by assessing more cases, using several examiners, and directly observing the candidate.
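The gain in reliability from assessing more cases can be illustrated with the Spearman-Brown prophecy formula, which predicts the reliability of a lengthened test from the reliability of a shorter one. The sketch below is purely illustrative: the starting reliability of 0.4 for a single unobserved long case is an assumed figure, not a value taken from the studies cited here.

```python
# Illustrative only: the Spearman-Brown prophecy formula predicts how the
# reliability of an assessment changes when the number of sampled cases
# is multiplied by a factor k. The starting reliability of 0.4 is assumed
# for the purpose of the example.

def spearman_brown(reliability: float, k: float) -> float:
    """Predicted reliability when the test length is multiplied by k."""
    return k * reliability / (1 + (k - 1) * reliability)

single_case_reliability = 0.4  # assumed reliability of one unobserved long case

for n_cases in (1, 2, 4, 8):
    predicted = spearman_brown(single_case_reliability, n_cases)
    print(f"{n_cases} case(s): predicted reliability = {predicted:.2f}")
```

On these assumed figures, moving from one case to eight raises the predicted reliability from 0.40 to about 0.84, which is the point made above about sampling more cases.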
The objective structured clinical examination (OSCE), developed by Harden and Gleeson,8 is a circuit of stations during which students perform standardised tasks and examiners mark against structured rating forms. It is flexible, reasonably valid, reliable, and quite practical. A technique such as the GALS screen9 is easy to assess using this style of examination.10 The OSCE is a valid and reliable form of assessment provided that there are sufficient numbers of stations. However, it can be a logistical nightmare to administer: many people are needed to make the examination run smoothly, especially if large numbers of candidates are being examined, and cost effectiveness and the space required are also problems.
In postgraduate assessment the OSCE is used for the Professional and Linguistic Assessments Board (PLAB) examination, run by the GMC. This is a test of clinical competence for doctors who have not trained in the European Economic Area and who wish to practise in the United Kingdom.11
TESTS OF COMPETENCE
To assess the competencies of rheumatologists, Aeschlimann and colleagues have developed the EULAR multiple choice question game,12 a knowledge and problem solving test. The reliability of this multiple choice questionnaire is high and it has been shown to be helpful in the training and assessment of competence of rheumatologists in Europe. It is feasible to extrapolate knowledge assessments to computer based methods, but there are problems with probity because the identity of candidates is difficult to verify in computer based tests.
In an observed long case13 the candidate has standard instructions and takes a history from, and examines, a patient. Examiners observe and do not interact with the candidate and there is a structured marking schedule. Standardised patients can be used14 to ensure consistency in this style of assessment.
The structured long case has been developed at Leicester University Medical School in the UK.15 During this procedure the candidates take a history with a standard instruction, and examiners mark according to an objective marking schedule. There is then a targeted viva with questioning relating to the case, usually involving a problem list and management. This method of skills assessment could be easily applied to patients with chronic rheumatic conditions such as rheumatoid arthritis.
As noted above, the OSCE used for the PLAB examination is a 14 station test written to a blueprint and integrated between clinical specialties. More than 1000 candidates take it each year16; it includes some rheumatology stations, such as examination of a hip.
The practical assessment of clinical examination skills (PACES)17 is the new clinical examination for the MRCP. It is a circuit of five stations and includes observed history taking and observed communication skills, marked on a standard schedule, in addition to clinical system examinations; the examination is computer marked. Detailed feedback is given to candidates who fail. This examination was implemented in June 2001 and has been well received.
Other new methods being evaluated by the Royal College of Physicians include 360 degree appraisal, DOPS, and the mini-CEX. DOPS is direct observation of procedural skills: observation of a skill against a standard marking schedule. The mini-CEX is a series of approximately six real clinical encounters, observed and assessed against standard criteria.18 360 degree assessment is a method whereby a number of colleagues (12 or more) complete a rating form, scoring aspects of communication skills, clinical skills, and behaviour on a scale (for example, 1–10).19 The colleagues included can come from an allied health profession or clinical background. These tests are useful summative methods for assessing trainees while they are working in real time in real clinical situations.
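As a rough illustration of how 360 degree returns might be summarised, the sketch below averages ratings on a 1–10 scale from 12 colleagues across three domains. The domains and numbers are invented for the purpose of the example and do not come from the Royal College pilots.

```python
# A minimal sketch of summarising 360 degree appraisal returns, assuming
# ratings on a 1-10 scale from 12 colleagues. All values are invented.
from statistics import mean

ratings = {
    "communication skills": [8, 7, 9, 8, 6, 7, 8, 9, 7, 8, 7, 8],
    "clinical skills":      [7, 8, 8, 7, 7, 9, 8, 7, 8, 8, 7, 7],
    "behaviour":            [9, 8, 8, 9, 7, 8, 9, 8, 8, 9, 8, 8],
}

for domain, scores in ratings.items():
    # Report the mean rating and the number of raters for each domain.
    print(f"{domain}: mean {mean(scores):.1f} from {len(scores)} raters")
```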
In addition to assessment, evaluation of rheumatologists’ competence has also been done by appraisal.20 Since December 2000, the Department of Health, BMA, and GMC have recommended that all consultants should be appraised. Appraisal comprises the compilation of a portfolio of activity to include such information as patient satisfaction questionnaires, hospital statistics on outcomes, and evidence of participation in educational activity. Each consultant rheumatologist is expected to have an annual interview where these issues are discussed and recorded and to document a personal and professional development plan. Hospital trusts in the UK are now penalised for not achieving appraisal targets.
WHAT COMPETENCIES ARE REQUIRED?
How should we decide what specific competencies a rheumatologist needs? There appears to be no clear consensus. In a survey of 173 training centres in Europe,21 there was some harmony in the requirements for a rheumatologist in areas such as the acquisition of clinical experience and knowledge, the ability to manage patients in a cost effective way, and the ability to promote shared decision making. However, there was still considerable diversity in areas such as the inclusion of training in electrophysiology techniques and some more complex procedures such as epidural injection. This should be improved with the development of a European curriculum for rheumatology,22 accessible on the EULAR website. To sample the curriculum by assessment, rheumatologists need to devise a blueprint (or a grid of subject areas and competencies) to define which components of competence need to be tested.
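As an illustration of what such a blueprint might look like, the sketch below represents it as a simple grid mapping curriculum areas to the assessment methods that could sample each component of competence. The areas and mappings are hypothetical examples and are not taken from the EULAR curriculum.

```python
# A purely illustrative assessment blueprint: a grid mapping curriculum
# areas to the assessment methods that might sample each component of
# competence. Areas and mappings are invented, not a published blueprint.

blueprint = {
    "Rheumatoid arthritis": {
        "knowledge": "EMQ/BOF paper",
        "skills": "OSCE station (GALS screen, joint examination)",
        "attitudes": "360 degree appraisal",
    },
    "Joint injection": {
        "knowledge": "EMQ/BOF paper",
        "skills": "DOPS (observed procedure)",
        "attitudes": "mini-CEX (consent and communication)",
    },
}

# A simple check that every curriculum area is sampled by at least one
# method for each component of competence.
for area, methods in blueprint.items():
    missing = [c for c in ("knowledge", "skills", "attitudes") if not methods.get(c)]
    print(area, "-> fully sampled" if not missing else f"missing: {missing}")
```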
CURRENT ASSESSMENTS OF PERFORMANCE IN THE UK
The best established assessment of performance in the UK was designed by the GMC. The GMC performance procedures were set up after an Act of Parliament in 1997.23 Before this time doctors could be removed from the medical register only because of poor health or misconduct; since 1997 poor performance has also been grounds for removal from the register. Performance assessment is by peer review in a two phase process. Phase 1 includes a portfolio of activity, a workplace visit with interviews with colleagues to establish what the doctor is like in his/her own practice, a case note review, and an observation of practice. Phase 2 is a knowledge and skills test, comprising an EMQ paper covering general medicine and rheumatology, with a section devoted to case histories specifically about rheumatology patients. So far one rheumatologist in the UK has been assessed under the performance procedures, after a conduct problem; he failed the peer reviewed test of knowledge and the peer observed skills test and was removed from the medical register.
CAN WE DETECT LESS COMPETENT DOCTORS?
The GMC has been compiling validation data which suggest that most doctors practise to a similar standard, so it is possible to detect the less competent doctors. Figure 1 shows box plots demonstrating that the majority of our colleagues’ scores lie within the box, with a very small number of outliers.
Figure 2 plots results from the same group of doctors, showing the relationship between EMQ and OSCE scores. There is a clear outlier: a doctor in the validation group whose performance had not been questioned.
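As an illustration of how outliers of this kind might be flagged, the sketch below applies the usual box plot whisker rule (scores more than 1.5 times the interquartile range below the lower quartile) to simulated EMQ and OSCE scores. The data are invented for the example and are not the GMC validation data.

```python
# Illustrative sketch only: flagging outlying scores of the kind described
# for figs 1 and 2. The scores are simulated, not real validation data.
import numpy as np

rng = np.random.default_rng(0)
emq = rng.normal(70, 5, 50)    # hypothetical EMQ scores for 50 doctors
osce = rng.normal(75, 4, 50)   # hypothetical OSCE scores for the same doctors
emq[0], osce[0] = 45.0, 50.0   # one simulated poor performer

def iqr_outliers(scores: np.ndarray) -> np.ndarray:
    """Indices of scores below Q1 - 1.5*IQR (the usual box plot whisker rule)."""
    q1, q3 = np.percentile(scores, [25, 75])
    return np.where(scores < q1 - 1.5 * (q3 - q1))[0]

print("EMQ outliers:", iqr_outliers(emq))
print("OSCE outliers:", iqr_outliers(osce))
```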
Support for less competent rheumatologists should be provided by a new special health authority, the National Clinical Assessment Authority (NCAA).24 This has been set up by the Department of Health to support poorly performing doctors whose problems are not severe enough to be picked up by the GMC. It aims to protect patients by helping the NHS deal with concerns about doctors. The NCAA’s assessment tools are under development and are currently being implemented nationally. The authority expects that most referrals will be resolved by its support of local procedures; a small number of cases will proceed to an assessment. Assessment includes a knowledge test, a psychological profile of the doctor by occupational psychologists, an occupational health review, a workplace visit, observation of practice, a notes audit, and a clinical skills test. These assessments have so far only been piloted, and as yet there is no published information on real cases because numbers are small.
“Less competent doctors can be detected”
Clinical competence and performance may be assessed effectively only by repeated direct observation of clinical practice.25 A study in Holland has already implemented this as a research tool: in 2001, eight incognito standardised patients visited 27 Dutch rheumatologists who had agreed to take part but did not know when the patients would attend or what their clinical diagnoses were. Results showed a variation in use of resources: doctors with longer experience in practice ordered fewer tests. This study has shown that it is possible to observe the actual practice of rheumatologists dealing with patients, and this method of assessment could be extrapolated to test the competence of those of us in practice in the UK.26
CONCLUSIONS
Assessment is likely to improve a rheumatologist’s competence. Competencies can be assessed in rheumatology; they are already being assessed by the Royal College of Physicians, the General Medical Council, and the National Clinical Assessment Authority. Assessment of actual practice is feasible, but has not yet happened on a large scale. It is important that we, as a group of professionals, keep abreast of the many regulatory and other bodies that will continue to affect our practice.
Prevention of poor performance is clearly our aim. This means that in the UK we should be active participants in keeping up to date. The supposed “no blame” culture developing in the NHS will facilitate more open use of supportive organisations like the NCAA and help to prevent disciplinary procedures, suspension, and referral to the GMC performance procedures.