Effectiveness of a Core-Competency–based Program on Residents’ Learning and Experience

Lesley Charles, MBChB, CCFP, Dip COE, Jean Triscott, MD, CCFP, Dip COE, Bonnie Dobbs, PhD, Jasneet Parmar, MBBS, Peter George Tian, MD, MPH, FPSO-HNS, FPCS, Oksana Babenko, PhD
Division of Care of the Elderly, Department of Family Medicine, University of Alberta, Edmonton, AB.


DOI: http://dx.doi.org/10.5770/cgj.19.213


ABSTRACT

Background

The Care of the Elderly (COE) Diploma Program is a six-to-twelve-month enhanced skills program taken after two years of core residency training in Family Medicine. In 2010, we developed and implemented a core-competency–based COE Diploma program (CC), in lieu of one based on learning objectives (LO). This study assessed the effectiveness of the core-competency–based program on residents’ learning and training experience, compared with those of residents trained using learning objectives.

Methods

The data from the 2007–2013 COE residents were used in the study, with nine and eight residents trained in the LO and CC programs, respectively. Residents’ learning was measured using preceptors’ evaluations of residents’ skills/abilities throughout the program (118 evaluations in total). Residents’ ratings of their training experience were measured using the Graduates Questionnaire, which residents completed after graduation.

Results

For residents’ learning, overall, there was no significant difference between the two programs. However, when examined as a function of the four CanMEDS roles, there were significant increases in the CC residents’ scores for two of the roles, Communicator/Collaborator/Manager and Scholar, compared with residents in the LO program. With respect to training experience, seven of the ten program components were rated higher by the CC residents than by the LO residents.

Conclusion

The implementation of a COE CC program appears to enhance residents’ learning and improve their training experience.

Key words: care of the elderly, core competencies, enhanced skills, diploma, resident

INTRODUCTION

Care of the Elderly (COE) Diploma programs provide supplementary training on geriatric care to family physicians who have finished a two-year family medicine residency or who are already in practice.(1) The program aims to improve the quality and availability of geriatric care in the face of a growing elderly population and the limited number of geriatricians. The College of Family Physicians of Canada (CFPC) recognized the COE program in 1989.(2) Since then, the program has grown and is now offered in 15 Canadian medical schools.(3)

The COE Program at the University of Alberta was established in 1993. At the program’s initiation, residency training was based on learning objectives, designed to provide residents with requisite medical knowledge and clinical assessment skills. In 2010, we shifted to a program based on core competencies. The primary goal in introducing this core-competency–based program was to improve training outcomes. Core competencies relate to the skills, behaviours, and knowledge that should be gained through a course or series of courses.(4) Unlike learning objectives, core competencies define expected levels of overall competence for practice. The change from learning objectives to core competencies was in response to the call for residency programs to use competency-based assessments.(4) Through a three-step iterative process described elsewhere,(5) we identified and selected 85 core competencies, each defined as fundamental knowledge, a skill set, an ability, or expertise in a specific subject area. These competencies overarch all components of the COE program, defining rotation evaluations, overall evaluations, the Academic Half-Day curriculum, and the Exit Examination.

The objectives of this study were to assess the effectiveness of a core-competency–based program on residents’ (1) learning and (2) training experience, as compared with those of residents trained using learning objectives. We hypothesized that the core-competency–based program would result in improved learning and training experience relative to the program based on learning objectives.

METHODS

Study Design

We used a pre-test/post-test design among residents who graduated from the program. The pre-intervention period, defined as calendar years 2007–2009, had residents trained using a learning-objective–based (LO) program. The post-intervention period, defined as calendar years 2010–2013, had residents trained using a core-competency–based (CC) program. This study received ethics approval from the Health Research Ethics Board at the University of Alberta.

Setting

The study involved residents in the Care of the Elderly Diploma Program in the Department of Family Medicine, University of Alberta. All graduates of the program in calendar years 2007–2013 were included in the study. The program typically accepts one to three residents a year; hence, for the seven-year period, we had 17 residents as participants in the study.

Intervention and Outcome Measures

The intervention was the implementation of a CC program in 2010. This program is based on the residents’ acquisition of 85 core competencies across 12 domains: cognition, function, mobility, medication, biology of aging, adverse events, incontinence, transitions of care, health-care planning, professionalism, communication, and research.(5) Prior to this intervention, the program was based on learning objectives.

To assess the effectiveness of the intervention, we measured two outcomes: (1) residents’ learning, and (2) residents’ training experience. Learning was measured using preceptors’ evaluations of residents’ skills/abilities throughout the training rotations (e.g., acute care, geriatric psychiatry, longitudinal clinic). Each evaluation was made using a standardized, two-page evaluation form containing Likert-scale items assessing residents’ learning according to the Canadian Medical Education Directives for Specialists (CanMEDS) roles: Family Medicine Expert, Communicator/Collaborator/Manager, Professional/Advocate, and Scholar. The Likert scale ranged from 1 (‘rarely meets’ expectations) to 5 (‘consistently exceeds’ expectations), with an option for N/A (not applicable) (see Table 1). There was also an open-ended question for preceptor comments and a checkbox for overall assessment (‘pass’ or ‘requires program review’).

TABLE 1. Example items from the Rotation Specific Evaluation Form for each of the four CanMEDS roles

 

Training experience in the program was assessed, after graduation, using an emailed questionnaire, the COE Graduates Questionnaire, an adaptation of the Department of Family Medicine’s Graduate Survey. The questionnaire was a five-page survey with sections on demographics, professional/practice characteristics, and characteristics of the COE Diploma Program. The items on professional/practice characteristics asked residents about the scope and location of their practice and their satisfaction with it. Residents responded to the items on program characteristics by rating the strengths of the COE program, as well as the extent to which it prepared them for clinical practice (see Table 2).

TABLE 2. Example items in the Care of the Elderly Graduates Questionnaire

 

Statistical Analyses

Descriptive statistics (percentages, averages, and ranges) were used to describe demographics. Inferential statistics were used for between-group analyses. Generalized Estimating Equations (GEE) analysis was used to assess differences in residents’ learning as a function of training program (LO vs. CC). GEE, a semi-parametric technique, exploits the full potential of longitudinal data by taking into account the lack of independence of measures within subjects (i.e., the multiple evaluations of each resident in the present study). For the GEE analyses, the main effects of Intervention (LO vs. CC) and resident’s Sex and their interaction (Intervention x Sex) were entered as factors, with residents’ Age as a covariate, to examine residents’ overall performance and performance on each of the four CanMEDS roles. To assess between-group differences in residents’ training experience, the t-test for independent samples and the z-test for proportions were used.
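
To make the analytic approach concrete, the following is a minimal sketch of how analyses of this kind could be run in Python with pandas, SciPy, and statsmodels. It is illustrative only: the file names, data-frame layout, column names, and counts are hypothetical stand-ins, not the study data.

```python
# Minimal, illustrative sketch of the between-group analyses (hypothetical data).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

# One row per preceptor evaluation: resident_id, program ("LO"/"CC"), sex, age,
# and the evaluation total score. File name and columns are hypothetical.
evals = pd.read_csv("evaluations.csv")

# GEE with an exchangeable working correlation accounts for the multiple
# evaluations nested within each resident; Intervention x Sex as factors,
# Age as a covariate.
gee_model = smf.gee(
    "total_score ~ C(program) * C(sex) + age",
    groups="resident_id",
    data=evals,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
print(gee_model.fit().summary())

# Independent-samples t-test on a continuous training-experience rating,
# using a hypothetical one-row-per-graduate survey data frame.
survey = pd.read_csv("graduates_survey.csv")
lo = survey.loc[survey["program"] == "LO", "experience_rating"]
cc = survey.loc[survey["program"] == "CC", "experience_rating"]
print(stats.ttest_ind(cc, lo))

# Two-sample z-test for proportions, e.g., graduates rating a program
# component as a strength; the counts here are invented for illustration.
z_stat, p_value = proportions_ztest(count=[5, 3], nobs=[5, 7])
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")
```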

RESULTS

There were a total of 118 evaluations made by preceptors for 17 residents (nine residents in the LO program and eight in the CC program). Overall, the average age of residents was 39 years (SD = 7 years; range: 29–53 years), with a predominance of females (76%). The average age of residents in the LO program was 41 years (SD = 5 years; range = 36–50 years) with 89% females. For the CC program, the average age was 37 years (SD = 9 years; range = 29–53 years) with 63% females. Overall, residents completed an average of seven rotations (range: 3–12), with the number of rotations dependent on whether the program was for 6 or 12 months.

Residents’ Learning

The average scores from preceptor evaluations of residents’ overall learning and of learning on each of the four CanMEDS roles (Family Medicine Expert, Communicator/Collaborator/Manager, Professional/Advocate, and Scholar) as a function of training program are shown in Table 3, with averages provided separately for female and male residents. There was no difference in residents’ overall learning between the LO (average = 128.93) and CC (average = 130.98) programs when collapsed across the Sex factor (p > .05).

TABLE 3. Average evaluation scores on CanMEDS roles among residents in the Core-Competency-Based and Learning-Objective-Based programs (n=118 evaluations)

 

Family Medicine Expert Role

The GEE analysis revealed that the Intervention x Sex interaction effect and the main effect of the Intervention were significant (p < .05). However, the main effect of resident’s Sex was not statistically significant (p > .16). As can be seen in Figure 1, the average evaluation score for male residents in the LO program was slightly higher than for female residents (61.22 vs. 58.01, respectively). However, this pattern reversed in the CC program, with the average evaluation score for female residents being higher than for male residents (59.95 vs. 49.66, respectively).

 


 

FIGURE 1. Average evaluation score among residents in the LO-based and CC-based programs for the CanMEDS Family Medicine Expert role

Communicator/Collaborator/Manager Role

There were significant main effects for Intervention (p < .05) and Sex (p < .001), with no significant interaction effect (p > .45). As can be seen in Figure 2(a), the average evaluation score was higher for residents in the CC program than in the LO program (27.52 vs. 24.83, respectively; p < .05). For the main effect of Sex (see Figure 2(b)), the average evaluation score of female residents was significantly higher than for male residents (28.53 vs. 23.82, respectively; p < .01).

 


 

FIGURE 2. CanMEDS Communicator-Collaborator-Manager role: (a) average evaluation score among residents in the LO-based and CC-based programs; (b) average evaluation score for male and female residents

Professional/Advocate Role

There were no significant differences in preceptors’ evaluations of residents for the Professional/Advocate role as a function of Intervention (p = .22), Sex (p = .73), or Intervention x Sex (p = .12). Specifically, average evaluation scores for male and female residents in the LO program were 25.31 and 27.22, respectively. In the CC program, the average scores for male and female residents were 29.09 and 26.33, respectively.

Scholar Role

There were significant main effects for Intervention (p < .01) and Sex (p < .01) for the Scholar role. Specifically, the average scores for residents increased from 18.06 in the LO program to 20.25 in the CC program (see Figure 3(a)). With respect to Sex (see Figure 3(b)), the average score for female residents was 20.50, which was significantly higher than the average score (17.81) for male residents. The interaction effect was not significant (p = .62).

 


 

FIGURE 3. CanMEDS Scholar role: (a) average evaluation score among residents in the LO-based and CC-based programs; (b) average evaluation score for male and female residents

Variability in Residents’ Evaluation Scores Across CanMEDS Roles

In addition to assessing the differences in average scores between residents in the CC and LO programs, we also examined within-group variability. Ratings for two of the CanMEDS roles were far more variable than for the remaining two. Specifically, the Family Medicine Expert and Communicator/Collaborator/Manager roles showed greater variability among residents in the LO program than among those in the CC program. An example of this difference can be seen in Figure 4: evaluation scores across residents in the LO program were far more variable than scores across residents in the CC program. For the Family Medicine Expert role, this pattern was observed on 13 of 16 items. Ratings for the Professional/Advocate and Scholar roles showed little variability among residents in either the CC or the LO program (see Figure 5).

 


 

FIGURE 4. Variability of residents’ evaluation scores in the LO-Based and CC-Based programs on an item on the Family Medicine Expert role; residents in the LO program showed more variability in their scores than residents in the CC program

 


 

FIGURE 5. Lack of variability of residents’ evaluation scores in the LO-Based and CC-Based programs on an item on the Professional/Advocate role
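
For readers interested in reproducing this kind of within-group variability check, a minimal sketch in Python (pandas) follows; the data frame, file name, and column names are hypothetical and do not come from the study dataset.

```python
# Item-level variability comparison: standard deviation of each evaluation
# item's scores, computed separately for the LO and CC programs.
# `item_scores` is a hypothetical long-format data frame with columns
# resident_id, program ("LO"/"CC"), role, item, and score.
import pandas as pd

item_scores = pd.read_csv("item_scores.csv")  # hypothetical file
sd_by_program = (
    item_scores
    .groupby(["role", "item", "program"])["score"]
    .std()
    .unstack("program")  # one row per item; columns hold LO and CC standard deviations
)
print(sd_by_program)
```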

Training Experience

The survey was sent to all nine residents in the LO program, with a response rate of 78% (n = 7). Of the eight residents in the CC program, the survey was sent to the five who had completed the program at the time of the study, with a response rate of 100% (n = 5); the other three CC residents were still in training at the time of the study and were not surveyed. As can be seen in Figure 6, for seven of the ten program components, the percentage of residents rating the component as a strength within the program was higher in the CC program than in the LO program. Of these seven, three components differed significantly between groups: Admission process, Orientation to the program, and Evaluation process of the residents (all p < .05). The differences between the two groups on the remaining program components were not statistically significant (all p > .05). Practice characteristics and other results of the Graduates Questionnaire are described elsewhere.(3)

 


 

FIGURE 6. Percentage of residents in the LO-based and CC-based program who responded ‘Strength within the Program’ for 10 program components

DISCUSSION

A CC-based program allows for the assessment of residents’ clinical competence. With the implementation of the CC-based program, our COE residents have improved their learning and have rated the program more highly than did residents in the LO-based program. The 85 core competencies overarch our entire program and have led to less variability in the evaluations of residents since their introduction. Of interest, this decreased variability is domain-specific: it is seen in two of the four roles, but not in the Professional/Advocate and Scholar roles. This is not unexpected, as the skills in the Professional/Advocate and Scholar roles are developed over many years of medical practice.

Core competencies are increasingly being used in training programs. Since 2002, the Accreditation Council for Graduate Medical Education has required residency programs to document mastery of six core competencies (Patient Care, Medical Knowledge, Practice-Based Learning and Improvement, Interpersonal and Communication Skills, Professionalism, and Systems-Based Practice) and has identified 360-degree feedback as useful in documenting these competencies.(6) Subsequently, in 2005, a survey of American family medicine programs showed that 257 of 287 programs (90%) had begun to implement evaluation programs using precepting, record review, and logs. The survey identified Patient Care as the most important core competency, with time listed as the major barrier to implementing core-competency evaluation methods.(7) These evaluation methods are similar to the sentinel habits used in the University of Alberta’s Family Medicine and Care of the Elderly programs as part of the Competency-Based Achievement System (CBAS).(8) This system, which uses field notes to provide residents with daily feedback, helps ensure that core competencies are achieved (see Appendix A). More recently (2013), osteopathic competencies in geriatrics were developed by consensus of an expert panel. The process resulted in the addition, for osteopathic students, of 14 new competencies and one new domain to the American Geriatrics Society’s 26 core competencies across eight domains.(9)

One limitation of our study is the small sample size. As in many other residency programs, the number of residents enrolled per year is limited. However, our results are informative because, while many publications detail core-competency development, implementation, and evaluation, none describe the outcomes of implementation. Further, the consistency in the data across years, in both the LO and CC programs, suggests that the findings may hold with a larger sample. Another limitation of our study is the reduced breadth of our core competencies. The competencies were selected by consensus, and many competencies initially identified were eliminated because of a lack of consensus. What is unknown is whether these core competencies would have been included by a different expert panel.

Future research in this area includes longer-term evaluation of the effectiveness of the CC-based program. We will continue to evaluate our program and, as we accumulate more data, will be better able to discern whether there are significant differences in the effectiveness of the CC-based program. Finally, we intend to collaborate with other COE Diploma programs in Canada on the implementation of core competencies and to continue evaluating their relevance to training and practice.

CONCLUSION

The implementation of a COE CC-based program appears to enhance residents’ learning and improve their training experience. Further development of the CC-based program is currently underway, with the intent to have the program accepted and adapted nationally.

ACKNOWLEDGEMENTS

This study received funding from the Department of Family Medicine at the University of Alberta.

CONFLICT OF INTEREST DISCLOSURES

The authors have no conflicts of interest to declare.

REFERENCES

1. The College of Family Physicians of Canada. Specific standards for family medicine residency programs accredited by the College of Family Physicians of Canada (The Red Book). Mississauga, ON: College of Family Physicians of Canada; 2013.

2. Alberta College of Family Physicians. Final report on care of the elderly. Edmonton, AB: Alberta College of Family Physicians; 2002.

3. Charles LA, Dobbs BM, McKay RM, et al. Training of specialized geriatric physicians to meet the needs of an aging population—a unique care of the elderly physician program in Canada. J Am Geriatr Soc. 2014;62(7):1390–92.

4. Oandasan I, Saucier D, editors. Triple C Competency-based Curriculum. Report Part 2: advancing implementation. Mississauga, ON: College of Family Physicians of Canada; 2013. Available from: www.cfpc.ca/uploadedFiles/Education/_PDFs/TripleC_Report_pt2.pdf. Accessed 2015 Jan 27.

5. Charles L, Triscott JA, Dobbs BM, et al. Geriatric core competencies for family medicine curriculum and enhanced skills: care of elderly. Can Geriatr J. 2014;17(2):53–62.

6. Rodgers KG, Manifold C. 360-degree feedback: Possibilities for assessment of the ACGME core competencies for emergency medicine residents. Acad Emerg Med. 2002;9(11):1300–04.

7. Delzell JE Jr, Ringdahl EN, Kruse RL. The ACGME core competencies: a national survey of family medicine program directors. Fam Med. 2005;37(8):576–80.

8. Ross S, Poth CN, Donoff M, et al. Competency-based achievement system: Using formative feedback to teach and assess family medicine residents’ skills. Can Fam Physician. 2011;57(9):e323–30.

9. Noll DR, Channell MK, Basehore PM, et al. Developing osteopathic competencies in geriatrics for medical students. J Am Osteopath Assoc. 2013;113(4):276–89.



Correspondence to: Lesley Charles, MBChB, CCFP, Dip COE, Division of Care of the Elderly, Department of Family Medicine, University of Alberta, 1259, 10230 111 Ave., Edmonton, AB T5G OB7, Canada, E-mail: Lcharles@ualberta.ca



APPENDICES

Appendix A:

Sentinel Habits

  1. Incorporates patient context: Incorporates the patient’s experience and context into problem identification and management

  2. Differential diagnosis: Generates relevant hypotheses resulting in a safe and prioritized differential diagnosis

  3. Uses best practice to manage: Manages patients using available best practices

  4. Prioritizes issues: Selects and attends to the appropriate focus and priority in a situation

  5. Key features for procedures: Uses generic key features when performing a procedure

  6. Respect and responsibility: Demonstrates respect and/or responsibility

  7. Verbal/written communication: Verbal or written communication is clear and timely

  8. Helps others learn: Teaches to relevant and achievable objectives

  9. Promotes practice quality improvement: Participates with practice/quality management

  10. Seeks guidance and feedback: Practices informed and guided self-assessment



Canadian Geriatrics Journal, Vol. 19, No. 2, June 2016