
Perception of Students and Examiners about Objective Structured Clinical Examination in a Teaching Hospital in Ethiopia

Authors Fisseha H, Desalegn H

Received 6 October 2021

Accepted for publication 1 December 2021

Published 11 December 2021 Volume 2021:12 Pages 1439–1448

DOI https://doi.org/10.2147/AMEP.S342582

Checked for plagiarism Yes

Review by single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Prof. Dr. Balakrishnan Nair



Henok Fisseha, Hailemichael Desalegn

Department of Internal Medicine, St. Paul’s Hospital Millennium Medical College, Addis Ababa, Ethiopia

Correspondence: Henok Fisseha, Email [email protected]

Introduction: The objective structured clinical examination (OSCE) has become a standard assessment tool in undergraduate medical school training. It is considered an objective assessment of students’ practical skills. OSCE is a resource-demanding assessment method that can present numerous challenges. A comprehensive assessment of perceptions regarding OSCE can help identify areas that need improvement. The aim of this study was to assess the perceptions of students and examiners towards OSCE.
Methods: A cross-sectional study was conducted on students and examiners undertaking OSCE from May 1 to July 30, 2021, using a structured questionnaire. Variables were compared using the Mann–Whitney U test and the Chi-square test. A P-value < 0.05 was considered statistically significant.
Results: A total of 141 students and 39 examiners participated in the study. The majority of students and examiners responded positively regarding the attributes, structure, organization and validity of OSCE. Compared with other assessments, OSCE was recommended for use in future exams by 38.3% of students and 51.3% of examiners. Students reported certain challenges, including stressfulness of the exam (51.1%), inadequate time (27.6%), and unsatisfactory orientation (30.5%). One-third of examiners considered the exam stressful, while 20.5% considered the time provided inadequate. Equipment for conducting the exam was considered inadequate by 39.1% of students and 56.4% of examiners. In addition, 80.1% of students recommended mock sessions, and 23.1% of examiners had no prior training on OSCE.
Conclusion: Students and examiners showed an overall positive perception of OSCE. Certain challenges that need improvement were identified. Continuing evaluation and refinement of OSCE by departments is needed. We recommend a further wide-scale national evaluation of the OSCE examination system for medical students.

Keywords: faculty, students, examination, perception, undergraduate medical education, Ethiopia

Introduction

There are several methods to assess the clinical skills of medical students. Methods considered traditional include short case assessment, long case assessment, multiple choice questions and viva examination. Other methods include essay questions, student projects, constructed response questions, tutor reports, portfolios and log book assessment, to mention a few.1,2 Many of these assessment methods risk being prejudiced and lack the objectivity and structure that are essential during examinations.1,3 To minimize these limitations, Harden et al introduced the objective structured clinical examination (OSCE) in 1975; it has since become a standard assessment tool in undergraduate and postgraduate medical school training.4,5 It is also used in other fields such as nursing, dentistry, midwifery, and physiotherapy.6

OSCE consists of multiple stations at which the examinee is asked to perform a set of defined tasks within a specified period of time, such as taking a history or performing a focused physical examination, thereby demonstrating certain skills.2,5 During OSCE, the examinee is expected to practically demonstrate data acquisition and interpretation, communication, problem-solving and procedural skills, competencies that most other assessment tools evaluate only as theoretical knowledge.5,7 Every student is evaluated at the same stations, which contributes to the exam’s objectivity.7

OSCE is, however, not without challenges. It requires abundant human and material resources, as well as adequate time for preparation.5,8 It may also rely on made-up scenarios and subjects that do not accurately reflect real-life situations.5 Examiners need training or experience and must standardize their final assessments.5 Even though checklists serve to standardize OSCE, their lack of flexibility can be a disadvantage.8 Finding trained standardized patients is both difficult and resource consuming in certain areas.9 Standardizing the setting or station poses further challenges.9 All these difficulties, along with resource limitations and lack of experience, can pose a challenge for OSCE in developing countries.8 Despite its advantages, simple standardization of stations and checklists does not ensure the reliability of OSCE.10 It has to be assessed periodically, and inputs from concerned bodies are essential for improving the organization, design and administration of the exam.1,11

Examiners play a crucial role in OSCE. They are responsible for identifying areas of examination focus, designing stations, conducting the OSCE and providing feedback to students.1 In contrast to students’ perceptions, examiners’ views have been evaluated less frequently in the past. The available studies have made an effort to understand examiners’ views, the sources of examiner bias and its consequences.1,3,8,12,13

Assessing the insight of both students and examiners provides a comprehensive assessment of OSCE, as both are crucial stakeholders. It is therefore important to know the perceptions of not only students but also examiners, so that the quality of OSCE can be improved and its benefits attained. A reliable assessment tool helps improve the education process by identifying areas of students’ education and performance that need improvement. Improving the quality of OSCE can also help apply it to other medical schools in similar settings.

To our knowledge, the experience reported from Ethiopia is limited to studies of students’ perspectives alone, with small numbers of participants.14,15 One study evaluated the views of both examinees and examiners, but again with a small number of participants.8 All of these studies included only a single department rather than a whole institution.

St. Paul’s Hospital Millennium Medical College (SPHMMC) is located in Addis Ababa, Ethiopia. SPHMMC is a tertiary teaching hospital established in 2007, when a medical college was opened within an existing hospital. It started as a medical school for undergraduate studies but has expanded to postgraduate and fellowship programs. The College initiated Ethiopia’s first integrated modular and hybrid problem-based curriculum for its undergraduate medical education.16 In the Ethiopian medical education system, students who complete high school and the subsequent national examination enter undergraduate medical education directly, without a prior college-level education or degree. During the first two years of medical school, education focuses mainly on the basic sciences, integrated with basic clinical skills teaching. In the next three years, students receive clinical practice-based education facilitated by clinical faculty members. Depending on the medical school, education before internship lasts 5 to 6 years. The last year is a licensed internship, with supervised rotations in the internal medicine, surgery, obstetrics and gynecology, pediatrics, emergency medicine and psychiatry departments.

Although many years have passed since OSCE was introduced at the study hospital, no attempt has been made so far to determine students’ and examiners’ perceptions of it; this study attempts to fill that gap.

Methods

A cross-sectional study was conducted on students taking OSCE and their examiners at SPHMMC. OSCE has been incorporated into the assessment methods for undergraduate medical students since 2016, after a few academic staff received training on the conduct and administration of OSCE. Since then, third-year medical students in internal medicine and obstetrics and gynecology attachments and fourth-year students in psychiatry and ophthalmology attachments have been evaluated with OSCE at the end of each attachment, in addition to traditional evaluations that include long case examination, short case examination, multiple choice questions, case-report submission and viva examination.

The OSCE is conducted with a maximum of 10 stations along with rest stations, with 5 to 10 minutes allotted per station. The stations have varying components, ranging from history taking and demonstration of physical examination skills to interpretation of laboratory or imaging investigations and performance of procedures, with the student observed while performing the tasks. Subsequent stations allow the student to present findings. Depending on the station, standardized patients or mannequins are used. The examiners are physicians who usually have specialty or higher-level training. Other physician faculty members or residents can participate as examiners if the number of specialists is inadequate to conduct the exam.

The study was conducted from May 1 to July 30, 2021, using a questionnaire that was distributed to students and examiners immediately upon completion of their examination.

The sample size was calculated using G*Power to detect differences between independent means.17 A power of 80% was used, with a medium effect size (Cohen’s d) of 0.5 and an alpha significance level of 5%. This resulted in a sample size of 128 students. A total of 60 examiners were expected to take part in the OSCE examination in the four departments. Since the number of examiners was low, all examiners were planned for inclusion in the study.
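
For readers who wish to verify this calculation without G*Power, the following is a minimal sketch in Python using the statsmodels power module. The parameter values are those reported above; the two-group, two-sided independent-samples t-test design is an assumption consistent with the stated comparison of independent means.

```python
# Minimal sketch reproducing the reported sample-size calculation
# (assumed design: two-sided independent-samples t-test, equal group sizes).
import math
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,          # medium effect size (Cohen's d), as reported
    alpha=0.05,               # 5% significance level
    power=0.80,               # 80% power
    ratio=1.0,                # equal group sizes (assumption)
    alternative="two-sided",
)
print(math.ceil(n_per_group))      # 64 per group
print(2 * math.ceil(n_per_group))  # 128 total, matching the reported sample size
```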

Data were collected using a self-administered, structured questionnaire after written informed consent was obtained. The student and examiner questionnaires were developed from questionnaires used in previous studies, with certain modifications.1,3,8,15,18–20 The student questionnaire collected data on the demographic characteristics of participants, followed by perceptions of the attributes, structure, organization and validity of OSCE. It concluded with questions comparing different methods of assessment. A similar but shortened tool was used for examiners. The questions on perception of OSCE were graded on a 5-point Likert scale from “strongly disagree” (score of 1) to “strongly agree” (score of 5).

Data were entered into and analyzed with the Statistical Package for the Social Sciences (SPSS) version 25. Descriptive statistics (percentages, mean, standard deviation (SD)) were used. The internal reliability of the questionnaire was assessed using Cronbach’s alpha. Variables were compared using the Mann–Whitney U test and the Chi-square test. A P-value < 0.05 was considered statistically significant.
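
The same analyses can be run outside SPSS. Below is a minimal sketch using pandas and SciPy; the data file name, column names and grouping variable are hypothetical placeholders, not the study’s actual dataset.

```python
# Minimal sketch of the reported analyses with pandas/SciPy in place of SPSS.
# The file name and column names below are hypothetical placeholders.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

df = pd.read_csv("osce_responses.csv")  # one row per respondent (hypothetical file)

# Internal reliability over the Likert items (assumed columns q1..q20).
likert_items = df[[f"q{i}" for i in range(1, 21)]]
print(f"Cronbach's alpha: {cronbach_alpha(likert_items):.3f}")

# Mann-Whitney U test: compare Likert scores between two groups (here, by sex).
u_stat, p_mw = stats.mannwhitneyu(
    df.loc[df["sex"] == "F", "q1"],
    df.loc[df["sex"] == "M", "q1"],
    alternative="two-sided",
)

# Chi-square test of independence on a contingency table
# (agreement defined here as a Likert score of 4 or 5).
table = pd.crosstab(df["sex"], df["q1"] >= 4)
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"Mann-Whitney P = {p_mw:.3f}, Chi-square P = {p_chi:.3f}")
```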

Ethical Consideration

Before conducting the research, permission was obtained from the SPHMMC Institutional Review Board.

Results

Out of the 200 students who took the OSCE examination during the study period, 141 participated in the study, a response rate of 70.5%. Among the participants, 58.2% were male. The mean (SD) age of the students was 22.8 (1.25) years, with ages ranging from 21 to 28 years. Third-year students from the internal medicine and obstetrics and gynecology departments accounted for 44% of responses; the remaining 56% were fourth-year students in ophthalmology and psychiatry attachments (Table 1).

Table 1 Characteristics of the Participants

Thirty-nine of the 60 examiners responded to the questionnaire (65%), of whom 25 (64.1%) were male. The mean (SD) age of the examiners was 34.8 (6.6) years, ranging from 27 to 58 years. Examiners from the internal medicine department provided the most responses, accounting for 21 (53.8%), with only 3 (7.7%) responses from the obstetrics and gynecology department. Specialists or sub-specialists accounted for 59% of the examiners (Table 1).

The overall internal reliability of the student questionnaire was high, with a Cronbach’s alpha coefficient of 0.818. An acceptable coefficient of 0.786 was obtained for the examiner questionnaire. Removal of any single question did not significantly improve either coefficient.

Students’ Perception of OSCE

Attributes and Structure of OSCE

Most of the items assessing the attributes of OSCE received positive responses of agreement. For instance, 72.3% of the participants agreed that OSCE revealed their strengths and weaknesses. Similarly, 68.8–69.5% agreed that it assesses a wide range of knowledge and skills and is a good reflection of the medical profession. On the other hand, 39.7–51.1% considered OSCE to be stressful or intimidating (Table 2).

Table 2 Students’ Perception on Attributes and Structure of OSCE

Sixty percent agreed that the exam was well structured and sequenced. Only 56.1% thought the time allocated for the stations was adequate, and 43.9% agreed that the sequence of stations was appropriate. Mock sessions before the exam were seen as essential by 80.1% of the participants (Table 2).

More female than male students agreed that OSCE reduced the chance of failing (56.1% vs 42.5%; P = 0.009). Similarly, more females than males reported that instructors were polite and helpful (77.6% vs 66.7%; P = 0.037) and that the instructions of the exam were clear (77.6% vs 67.5%; P = 0.026).

Organization, Validity and Reliability of OSCE

Agreement with good facilitation of the flow of students, availability of organizers and conduciveness of the environment was reported by 73.1–88.6% of participants. Around 30–39% disagreed with the usefulness of the orientation provided and with the quality and availability of the examination instruments (Table 3).

Table 3 Students’ Perception on Organization, Validity and Reliability of OSCE

Participants had variable perceptions of the validity of OSCE. While 82.3% agreed that OSCE is a good measure of competency, only around half agreed that it truly measures clinical skills. Similarly, only about half perceived it as a standardized exam unaffected by gender, personality or ethnicity (Table 3).

Females also reported higher agreement than males on the organization of OSCE, particularly regarding the facilitation of the flow of students between stations (94.8% vs 85.2%; P = 0.022), the availability of organizers (86.2% vs 74.1%; P = 0.032), and the availability of examination equipment (46.6% vs 33.3%; P = 0.049). Regarding the validity of OSCE, females agreed to a greater extent that the exam reduces gender, ethnicity or personality bias (63.2% vs 48.1%; P = 0.001) and that it measures competency (91.2% vs 77.7%; P = 0.01).

Finally, students were asked whether they recommend that OSCE continue to be used in the future; 80.1% agreed or strongly agreed. However, fewer participants (66%) recommended its use in their final qualifying exam.

Examiners’ Perception of OSCE

The examiners responded to similar questions. OSCE’s coverage of different knowledge and skills, the adequacy of stations, its positive impact on learning and its being a good measure of competency were among the items with high agreement (74–82%). Availability of equipment for OSCE was considered a major challenge by 56.4%, and 38.5% did not consider OSCE stressful (Table 4).

Table 4 Examiner’s Perception of OSCE

Some questions were asked of both examiners and students, and the following showed significant differences in responses. More students than examiners considered OSCE to provide feedback (63.8% vs 43.6%; P = 0.015). In addition, more examiners (71.8%) than students (50%) considered OSCE to be a standardized examination (P = 0.016).

Comparison Between Different Assessment Methods

Both students and examiners were asked to compare the different assessment methods. Multiple choice questions (MCQ) and long case exams were considered the most difficult by both groups, with more students than examiners rating long case exams as harder (39.7% vs 28.2%). Students considered MCQ the fairest evaluation tool (53.2%), while examiners preferred OSCE (43.6%). Both students and examiners reported that long case exams are the most stressful, although more examiners than students also considered short case exams stressful (20.5% vs 3.5%). Neither examiners nor students considered OSCE likely to result in a failing grade. However, examiners thought MCQ the most likely to result in failing (64.1%), while students thought the long case exam could have the poorest outcome (38.3%) (Table 5).

Table 5 Student and Examiner Comparison of Assessment Methods

Discussion

The student response rate in this study was 70.5%, and the number of respondents exceeded the required sample size. The rate was low compared with previous studies.1,15,18,21,22 However, two previous studies from Ethiopia and Nigeria reported lower student response rates of 64.5% and 53.8%, respectively.8,19 The examiner response rate of 65% exceeded a previous report of 55%.1

Both students and examiners had overall positive perceptions of the attributes, structure, organization and validity of OSCE. Similar general satisfaction with OSCE has been reported previously in a study that evaluated the perceptions of both students and examiners.1 Previous reports on students also showed similarly high acceptability and objectivity of OSCE.14,18 The majority of students considered OSCE a good assessment of knowledge and skills. In addition, it revealed areas of strengths and weaknesses, which is considered among the major attributes of OSCE. Similar findings have been reported in previous studies.1,18,23

OSCE was, however, considered stressful by just over half of the students, comparable to a study from Nigeria.21 Other studies report inconsistent findings about the stressfulness of OSCE. Two-thirds of students in two studies found OSCE stressful and suggested that inadequate preparation for the examination and the involvement of external examiners might contribute.18,23 Inadequate time and inadequate standardization have also been proposed as causes of stress, along with the need for preparation.1,11,12 Stress was reported irrespective of previous experience with OSCE, which makes familiarity an unlikely mitigating factor.18,23 One third of the examiners agreed that OSCE is stressful, a finding reported previously.1

However, both groups of our participants considered OSCE much less stressful than other examinations, particularly long case, short case and viva exams. A similar comparison has been reported previously.14,19,24 One study reasoned that unsympathetic interaction between examiners and students causes the higher stress during long and short case examinations.14 This contrasts with other previous reports in which OSCE was comparatively seen as more stressful.1,12,18 The difference could be due to variations in the assessment methods used in these studies, on top of differences in study settings. In addition, since one third of the examiners and one half of the students reported that OSCE was stressful, the report of less stress with OSCE is only relative.

Inadequate time to complete the stations was reported by 27.6% of students and 20.5% of examiners. Time has been reported as a challenge to OSCE in much of the previous literature, a concern shared by examiners.8,11,12,18,19,23 One qualitative study reported a student’s suggestion to include rest stations to mitigate the impact of time constraints, giving examinees time to prepare psychologically.8 In another study, examiners attributed the perceived inadequacy of time to students’ inadequate preparation for the exam.18 This should be corrected because, unlike in other exams, students cannot report their time constraints while the exam is in progress.19 Every station and its allocated time should be assessed well in advance. Exams should be evaluated regularly and time adjusted accordingly. Identical time limits should not be set for all stations, as different stations require different amounts of time to accomplish their tasks.

Practice sessions were recommended by most of our participants and were also considered important in other studies.1,8,12 Mock sessions can improve time utilization and exam performance. They can also acquaint students with the exam process and increase their confidence.8 Mock exams with practical demonstrations and clear feedback should therefore be considered before OSCE exams.

Our findings revealed that almost one third of the students were not satisfied with the orientation provided. Two studies reported adequate pre-examination orientation, which was lacking in our setting.1,18 Pre-examination orientation is important for both students and examiners. Students can also be trained during routine clinical teaching activities. One study reasoned that poor orientation of students can stem from inadequate knowledge and training of the examiners themselves; training of examiners can lead to better standardization of exam scoring.25 In our study, almost a quarter of the examiners had not received any kind of training on OSCE, which can lower the quality of the exam. This highlights the need for training of examiners, including refresher courses as needed.

The availability of material resources such as equipment was considered inadequate by 39.1% of students and 56.4% of examiners. An even larger percentage reported this as a challenge previously.1,12 OSCE is a very demanding examination method requiring abundant human and material resources.25 The college must make the availability of simulation materials a priority. A well-organized and complete simulation lab should be prepared both for education and for final assessment.

Our participants rated OSCE more favorably than most other assessment tools with regard to difficulty, stressfulness, likelihood of passing, and feedback on strengths and weaknesses. These attributes make OSCE an attractive examination method. However, both students and examiners considered multiple choice questions the most objective assessment method, followed by OSCE. The former has the clear advantage of objectivity, as students are provided with identical questions and time limits. Among the practical examinations, however, OSCE has the highest objectivity, which is inherent in its design. This has been echoed previously, including in one study where 97.4% of students considered OSCE more objective than other traditional assessments.3,18,19

The findings from this study show that OSCE is a good assessment method and is considered acceptable by both students and examiners. More than 80% of students recommend its use in the future, and 71.8% of examiners would like to see it used in other clinical years. OSCE therefore needs to be supported, and the challenges identified should be mitigated by the concerned bodies. In this way, the quality of OSCE and the satisfaction of students and examiners can be enhanced.

This study included a relatively larger number of participants than previous studies. It was a comprehensive evaluation of both students’ and examiners’ perspectives, which is essential to identify challenges and make holistic recommendations. It also assessed OSCE in different departments. The study nevertheless has limitations. The response rate was relatively low, although the number of respondents exceeded the calculated sample size. In addition, the study evaluated OSCE perceptions in only one teaching institution.

Conclusions

This study provided a thorough assessment of students’ and examiners’ perceptions of OSCE. There was an overall positive perception of OSCE by both examiners and students. Despite this, some challenges were identified. Practice OSCE sessions should be arranged before the exam, followed by a detailed orientation of students on the conduct of the examination. The time given at stations should be reviewed before exams. These measures can potentially reduce the stress associated with OSCE and improve the exam process for future students. Examiners should also receive courses and training on the objective examination of medical students. In addition, the required materials should be made available. Future studies that take these limitations into account are needed, including a comprehensive nationwide OSCE assessment.

Abbreviations

MCQ, multiple choice questions; OSCE, objective structured clinical examination; SPHMMC, St. Paul’s Hospital Millennium Medical College; SD, standard deviation.

Institutional Review Board Statement

The study was approved by the Institutional Review Board of St. Paul’s Hospital Millennium Medical College (approval code: PM23/693).

Data Sharing Statement

The datasets used in this study are available from the corresponding author on reasonable request.

Informed Consent Statement

Written informed consent was obtained from all subjects involved in the study.

Acknowledgment

We would like to thank all the participants who gave their valuable time and participated in the study.

Funding

The authors did not receive any funding.

Disclosure

Both authors declare no conflicts of interest regarding this work.

References

1. Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B. An evaluative study of objective structured clinical examination (OSCE): students and examiners perspectives. Adv Med Educ Pract. 2019;10:387–397. doi:10.2147/AMEP.S197275

2. Tabish SA. Assessment methods in medical education. Int J Health Sci (Qassim). 2008;2(2):3–7.

3. Alsaid AH, Al-Sheikh M. Student and faculty perception of objective structured clinical examination: a teaching hospital experience. Saudi J Med Med Sci. 2017;5(1):49–55. doi:10.4103/1658-631X.194250

4. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–451. doi:10.1136/bmj.1.5955.447

5. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219–222. doi:10.5001/omj.2011.55

6. Skrzypek A, Szeliga M, Stalmach-Przygoda A, et al. The objective structured clinical examination (OSCE) from the perspective of 3rd year’s medical students - a pilot study. Folia Med Cracov. 2017;57(3):67–75.

7. Larsen T, Jeppe-Jensen D. The introduction and perception of an OSCE with an element of self- and peer-assessment. Eur J Dent Educ. 2008;12(1):2–7. doi:10.1111/j.1600-0579.2007.00449.x

8. Ataro G, Worku S, Asaminew T. Experience and challenges of objective structured clinical examination (OSCE): perspective of students and examiners in a clinical department of Ethiopian University. Ethiop J Health Sci. 2020;30(3):417–426. doi:10.4314/ejhs.v30i3.13

9. Wilby KJ, Diab M. Key challenges for implementing a Canadian-based objective structured clinical examination (OSCE) in a Middle Eastern context. Can Med Educ J. 2016;7(3):e4–e9. doi:10.36834/cmej.36720

10. Brannick MT, Erol-Korkmaz HT, Prewett M. A systematic review of the reliability of objective structured clinical examination scores. Med Educ. 2011;45(12):1181–1189. doi:10.1111/j.1365-2923.2011.04075.x

11. Emadzadeh A, Ravanshad Y, Makarem A, et al. Challenges of OSCE national board exam in Iran from participants’ perspective. Electron Physician. 2017;9(4):4195–4201. doi:10.19082/4195

12. Bani-Issa W, Al Tamimi M, Fakhry R, Tawil HA. Experiences of nursing students and examiners with the objective structured clinical examination method in physical assessment education: a mixed methods study. Nurse Educ Pract. 2019;35:83–89. doi:10.1016/j.nepr.2019.01.006

13. Chong L, Taylor S, Haywood M, Adelstein BA, Shulruf B. The sights and insights of examiners in objective structured clinical examinations. J Educ Eval Health Prof. 2017;14:34. doi:10.3352/jeehp.2017.14.34

14. Shitu B, Girma T. Objective structured clinical examination (OSCE): examinee’s perception at Department of Pediatrics and Child Health, Jimma University. Ethiop J Health Sci. 2008;18(2):47–52.

15. Gelan EA, Essayas R, Gebressilase K. Perception of final year medical students about objective structured clinical examination in the department of general surgery. Ethiop Med J. 2015;53(4):183–189.

16. SPHMMC. About – Saint Paul’s Millennium Medical College. Available from: https://sphmmc.edu.et/about/. Accessed April 2021.

17. Faul F, Erdfelder E, Lang AG, Buchner A. G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods. 2007;39(2):175–191. doi:10.3758/bf03193146

18. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ. 2004;4:22. doi:10.1186/1472-6920-4-22

19. Ameh N, Abdul MA, Adesiyun GA, Avidime S. Objective structured clinical examination vs traditional clinical examination: an evaluation of students’ perception and preference in a Nigerian medical school. Niger Med J. 2014;55(4):310–313. doi:10.4103/0300-1652.137191

20. Müller S, Settmacher U, Koch I, Dahmen U. A pilot survey of student perceptions on the benefit of the OSCE and MCQ modalities. GMS J Med Educ. 2018;35(4):Doc51. doi:10.3205/zma001197

21. Nasir AA, Yusuf AS, Abdur-Rahman LO, et al. Medical students’ perception of objective structured clinical examination: a feedback for process improvement. J Surg Educ. 2014;71(5):701–706. doi:10.1016/j.jsurg.2014.02.010

22. Furmedge DS, Smith LJ, Sturrock A. Developing doctors: what are the attitudes and perceptions of year 1 and 2 medical students towards a new integrated formative objective structured clinical examination? BMC Med Educ. 2016;16:32. doi:10.1186/s12909-016-0542-3

23. Khan SA, Aaraj S, Talat S, Javed N. Students’ perception and scores in Paediatrics end-of-clerkship and final professional objective structured clinical examination (OSCE): a comparative study. Pak J Med Sci. 2021;37(2):525–530. doi:10.12669/pjms.37.2.3422

24. Dadgar SR, Saleh A, Bahador H, Baradaran HR. OSCE as a tool for evaluation of practical semiology in comparison to MCQ & oral examination. J Pak Med Assoc. 2008;58(9):506–507.

25. Onwudiegwu U. OSCE: design, development and deployment. J West Afr Coll Surg. 2018;8(1):1–22.
