Background: Few studies have systematically compiled the body of research on the assessment process in competency-based medical education (CBME) and identified knowledge gaps regarding the structure of the evaluation process. The objectives of this study were therefore to develop an assessment framework model for CBME applicable in the Indian context and to thoroughly examine the CBME assessment framework.
Methods: The databases consulted were PubMed, MEDLINE (Ovid), EMBASE (Ovid), Scopus, Web of Science, and Google Scholar. Search parameters were restricted to English-language publications on competency-based education and assessment methods, published between January 2006 and December 2020. A descriptive overview of the included studies (in tabular form) served as the basis for data synthesis.
Results: The database searches yielded 732 records, of which 36 met the eligibility criteria. The 36 studies comprised a mix of randomized controlled trials, focus group interviews, and questionnaire-based studies, including cross-sectional studies, qualitative studies (3), and mixed-methods studies. The articles were published in 10 different journals, with the highest number appearing in *BMC Medical Education* (18). The average quality score of the included studies was 62.53% (range: 35.71–83.33%). Most authors were from the United Kingdom (7), followed by the United States (5). The included studies were grouped into seven categories based on their dominant focus: shifting from a behaviorist to a constructivist approach to assessment (1 study), formative assessment (FA) and feedback (10 studies), barriers to the implementation of feedback (4 studies), computer-based or online formative tests with automated feedback (5 studies), video feedback (2 studies), e-learning platforms for formative assessment (4 studies), and workplace-based assessment (WBA), including the mini-clinical evaluation exercise (mini-CEX) and direct observation of procedural skills (DOPS) (10 studies).
Conclusions: Various constructivist techniques, such as concept maps, portfolios, and rubrics, can be used for assessment. Self-regulated learning, peer feedback, online formative assessment, computer-based formative tests with automated feedback, web-based computerized Objective Structured Clinical Examination (OSCE) systems, and narrative feedback in place of numerical scores in the mini-CEX are all ways to enhance student engagement in the design and implementation of formative assessment.
Gupta SK, Srivastava T. Assessment in Undergraduate Competency-Based Medical Education: A Systematic Review. Cureus. 2024 Apr 11;16(4):e58073. doi: 10.7759/cureus.58073.
García Dieguez M. Comment on: Assessment in Undergraduate Competency-Based Medical Education: A Systematic Review [Internet]. Pan American Health Organization Bibliographic Repository; cited 07/10/2025. Available at: https://campus.paho.org/en/repo/assessment-undergraduate-competency-based-medical-education-systematic-review
García Dieguez M.
CEEProS Universidad Nacional del Sur
This article is useful for those working as educators in direct contact with students or involved in the implementation and management of undergraduate competency-based medical education (CBME) programs. It is a systematic review that examines the current state of assessment practices within CBME, focusing on countries that have already begun this transition. While not particularly innovative, the article provides a solid collection of information for faculty members seeking to use valid and coherent instruments aligned with the competency-based approach.
The review included 36 studies whose methodological quality was formally appraised. These studies cover experiences from multiple countries and highlight significant diversity in the assessment approaches employed. The most frequently used methods include the mini-CEX, DOPS, portfolios, rubrics, and simulation-based performance assessments such as OSCEs.
The article shows that although many programs have adopted elements of CBME, assessment remains a critical issue. Traditional tools persist, faculty training in authentic assessment is limited, and there is often a misalignment between expected competencies and the evidence evaluated. Additionally, only a small number of studies have assessed the reliability and validity of the instruments used.
From a methodological standpoint, the article adheres to PRISMA guidelines, featuring a well-defined systematic search, the use of validated quality appraisal tools, and a detailed classification of the instruments and competency levels assessed. A notable limitation is that only English-language articles were reviewed.
- Provides an updated mapping of assessment instruments used in undergraduate CBME.
- Recommends local validation of instruments before institutional adoption.
- Reinforces the value of portfolios and clinical performance assessments as longitudinal tools.
- Highlights the need for faculty training in new forms of authentic assessment.
- Suggests integrating formative strategies with structured, performance-centered feedback.