Ronaye Gilsenan, MA1, Rhonda Schwartz, MA2, Iris A. Gutmanis, PhD3, Adam M.B. Day, PhD4, David P. Ryan, PhD, CPsych5, Rosemary R.A. Brander, PhD6, Kelly Milne, OT1, Frank Molnar, MD, FRCPC7
1Regional Geriatric Program of Eastern Ontario, Ottawa, ON
2Seniors Care Network, Port Hope, ON
3Regional Geriatric Program of South West Ontario, Western University, London, ON
4North East Specialized Geriatric Centre, Thunder Bay, ON
5Regional Geriatric Program of Toronto, University of Toronto, Toronto, ON
6Specialized Geriatric Services, South East Ontario, Queens University, Kingston, ON
7Regional Geriatric Program of Eastern Ontario, University of Ottawa, Ottawa, ON, Canada
While generic, site, and disease-specific patient experience surveys exist, such surveys have limited relevance to frail, medically complex older adults attending appointment-based specialized geriatric services (SGS). The study objective was to develop and evaluate a patient experience survey specific to this population.
Using established survey research methods, this study was conducted collaboratively with older adults (patients and family members/friends) at three Ontario sites offering SGS. The study was done in three phases: Phase One—literature review, evidence alignment, and operationalization of core survey items; Phase Two—cognitive interviews and refinement; and Phase Three—pilot testing, survey item analysis, and refinement.
Based on an evidence-informed framework, the “Older Adult Experience Survey” includes 12 core items, two global rating items, two open-ended questions, and two demographic questions. The summed 12 core items demonstrated acceptable internal consistency (Cronbach’s alpha: 0.83), and the correlation between the summed score and a global question was 0.59, providing evidence of construct validity. The survey also demonstrated face and content validity.
This open access, collaboratively developed, psychometrically sound patient experience survey can be used to assess, then improve, the clinical experience and quality of care of older adults attending appointment-based SGS clinics/programs.
Key words: specialized geriatric services, frail older adults, survey, patient experience, appointment-based
Ongoing assessment of “patient experience”(1) is key to improving health-care quality and reducing costs.(2,3) While generic,(4) site-specific,(5,6,7) and disease-specific(8) patient experience surveys have been developed and implemented, their item wording is not specific to appointment-based specialized geriatric services (SGS), and they may not include dimensions relevant to frail older adults.(9) Further, existing SGS patient satisfaction/experience surveys use varied wording and rating scales, which impedes provincial reporting.
In Ontario, a collaborative of 11 regional programs provides SGS to the ever-increasing number of older adults(10) living with, or at risk for, frailty(11) whose health, dignity, and independence are challenged due to multiple complex medical, functional, and psychosocial issues. This SGS collaborative works with primary care physicians, community professionals, and others, offering a spectrum of hospital- and community-based services to older adults.
The objective of this study was to develop and evaluate a minimum set of core survey items for measuring the experience of older adults in appointment-based SGS settings.
Using established survey research methods,(12) this study was conducted in three phases (see Table 1) as approved by the Health Sciences North Research Ethics Board, the Ottawa Health Science Network Research Ethics Board, and The Scarborough Hospital Research Ethics Board.
TABLE 1 Methods used to develop and test the Older Adult Experience Survey
The literature review built on work done as part of Canadian primary health-care system renewal,(13) as both SGS and primary care are largely appointment-based services. The work conducted to develop the Ontario Primary Care Performance Measurement Framework,(14) along with Wong and Haggerty’s scoping review(15) and other publicly available articles describing patient experience frameworks and surveys, formed the basis of this non-exhaustive literature review. Search methods included electronic database queries, Google searches for grey literature, and hand searches of key articles. Rigid inclusion/exclusion criteria were not applied, but articles focused on older adults and appointment-based services were of prime interest. The search was limited to articles published in English between 2002 and 2016.
An evidence-informed patient experience framework was selected based on its relevance to SGS settings, the SGS population, and alignment with other Canadian work in this area. Following this, a group consensus approach based on the Delphi methodology(16) was used to identify SGS-applicable dimensions and subdimensions, and to draft item wording for an SGS patient experience survey (see Appendix A).
As informed by Willis and Artino,(17) semi-structured cognitive interviews (see Appendix B & C) were conducted at three SGS sites (Ottawa, Scarborough, Sudbury). A convenience sample of 5–15 older adults(18) was required. Older adults who attended SGS appointment-based services during the study period and who were able to speak and understand English were asked to provide insights into the utility, relevance, and wording of each draft item. Based on their feedback, a pilot version of the survey was finalized using the methodology described in Appendix B.
The pilot version of the survey was tested at two sites (Ottawa, Scarborough) with another convenience sample of cognitively intact older adults. It was determined that 73 patients per site were needed, assuming a 10% margin of error and a 95% confidence interval around a sample proportion of 50%. Statistical analyses were performed using SPSS (version 24; IBM, Armonk, NY, USA). Item-by-item frequency distributions were generated, and Cronbach’s alpha was calculated for the summed core items.
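For readers wishing to replicate the internal-consistency analysis, Cronbach’s alpha can be computed directly from the raw item responses. The sketch below is a minimal illustration under assumed inputs (a complete respondents-by-items matrix of the 12 core items, with no missing values), not the authors’ SPSS procedure.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(summed scores))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical example: three respondents answering two perfectly
# consistent items yields the maximum alpha of 1.0.
responses = np.array([[1, 1], [2, 2], [3, 3]], dtype=float)
print(round(cronbach_alpha(responses), 2))  # -> 1.0
```

In practice, respondents with any missing core-item responses would first be dropped (as was done here, reducing the analytic sample from 145 to 131).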
The distribution-specific correlation between the global item (“Overall, I felt that the care and services I experienced were…”) and the summed core-item score was determined to provide evidence of construct validity. Correlations among the framework dimensions were then examined. Finally, framework subdimensions were used to code responses to the open-ended questions (“What worked well?” and “What could be improved?”).
Subsequently, the pilot survey was revised and re-evaluated by a convenience sample of older patients who received SGS services at the Scarborough site. These patients were given both the pilot survey and the revised pilot survey in random order and then participated in cognitive interviews (see Appendix D). Item-by-item frequency distributions were compared using distribution-appropriate statistics, and qualitative responses were coded.
Wong and Haggerty’s(15) primary care framework was selected and used to guide the development of the SGS patient experience survey. All six framework dimensions and 12/17 subdimensions were deemed applicable to SGS (see Table 2). This Phase One draft survey included 16 core items and two global items.
TABLE 2 Wong and Haggerty(15) dimensions and subdimensions retained for Older Adult Experience Survey
Interviewees (n=19) indicated that the draft survey items measured all key aspects of their patient experience, thereby providing some evidence of both face and content validity. Feedback led to the rewording of eight items and the deletion of four items pertaining to three subdimensions. Based on these findings, a pilot survey was generated that included 12 items scored on a 5-point Likert scale, one global question scored on an 11-point Likert scale, a willingness to recommend item rated on a 4-point Likert scale, and two open-ended questions (see Appendix B).
Of the estimated 257 patients who met the Phase Three study inclusion criteria, 145 were recruited [Ottawa: n=75/114 (65.8%); Scarborough: n=70/123 (56.9%)]. Due to missing values, summed scores for the 12 core items were generated for 131 patients. Summed scores ranged from 43 to 60. The mean of the summed core items was 56.9 (SD: 3.9) and the median was 59 (interquartile range (IQR): 6). Cronbach’s alpha was 0.83, demonstrating acceptable internal consistency.(19) As the frequency distribution of summed scores deviated significantly from a normal distribution (Shapiro-Wilk test: 0.80, p < .001; skewness: −1.180), non-parametric tests (Spearman rho correlations, Mann-Whitney U tests, or Kruskal-Wallis tests) were used to assess statistical associations and group differences. The Spearman rho correlation between the 12-item summed score and the global experience rating was 0.59, providing evidence of construct convergent validity. Although inter-domain Spearman rho correlations varied from 0.19 (trust and access) to 0.58 (comprehensiveness of services and continuity and coordination), all correlations were statistically significant (p < .05).
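Spearman’s rho, used here because the summed scores were non-normal, is simply the Pearson correlation of the rank-transformed data. The sketch below is illustrative only (the study analyses were run in SPSS) and uses hypothetical variable names for the summed scores and global ratings.

```python
import numpy as np

def rankdata(a):
    """Assign ranks 1..n; tied values share the mean of their rank positions."""
    a = np.asarray(a, dtype=float)
    order = np.argsort(a)
    ranks = np.empty(len(a))
    ranks[order] = np.arange(1, len(a) + 1)
    for v in np.unique(a):              # average ranks across ties
        mask = a == v
        ranks[mask] = ranks[mask].mean()
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    return float(np.corrcoef(rx, ry)[0, 1])

# Hypothetical data: a perfectly monotone relationship gives rho = 1.0.
summed_scores = [43, 50, 55, 59, 60]
global_ratings = [5, 7, 8, 10, 11]
print(round(spearman_rho(summed_scores, global_ratings), 2))  # -> 1.0
```

Because rho depends only on ranks, it is robust to the left-skew seen in the summed scores; applied to the study data, an analysis of this form yields the rho of 0.59 reported above.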
For each of the 12 core survey items, more than 60% of patients selected the top response category, and 54 patients (41.2%) selected the top response for all items (see Table 3). Despite relatively little dispersion, differences were detected by time of year and/or by site/program for 6 of the 12 survey items (Mann-Whitney U or Kruskal-Wallis test, p < .05), providing some evidence of construct divergent validity. Qualitative survey responses were mapped to nine of the ten framework subdimensions and provided further information on perceived strengths and areas for improvement.
TABLE 3 Pilot survey item-by-item analysis
The pilot survey was then revised. Two of the core items were reworded, two demographic items were added, instructions were shortened, anchors for the 12 core survey items and one global assessment question were changed, the survey name was modified, and minor changes were made to formatting.
Testing of the revised pilot survey was conducted with additional patients (n=5). No one expressed concerns regarding the above revisions. Four patients found the revised survey easier to complete. Response selections were identical for four of the 12 core items, and only once did a response shift by two points. Further, mean summed core-item scores did not differ significantly between versions (pilot vs. revised pilot: 55.4 [SD: 4.8] vs. 55.2 [SD: 4.9]), nor did median scores (56 [IQR: 8.5] vs. 57; Mann-Whitney U: p = 1.00).
A framework-based patient experience survey specific to frail, medically complex older adults attending appointment-based SGS was developed and tested by incorporating input from older adults, their family members/friends, and experts in geriatrics and research. When taken together, the 12 core items of the Older Adult Experience Survey demonstrated acceptable internal consistency (Cronbach’s alpha: 0.83). Slightly more than 40% of patients had the maximum score, perhaps accurately reflecting perceived patient experience or perhaps indicating a ceiling effect. Despite this finding, the survey was able to identify meaningful group differences. Users are encouraged to use a mixed methods approach to triangulate qualitative and quantitative information.(20)
Testing was conducted on an English-language, paper-based version of the survey at three sites providing appointment-based SGS. Further examination of interrater and test/retest reliability, factor structure, discriminant validity, and response rates is warranted. Psychometric properties will need to be re-evaluated if the survey is translated into another language or converted to an electronic version.
Future studies may provide evidence of the survey’s clinical utility and ability to identify areas for quality improvement that will lead to improved quality of patient care. Findings may also provide insights for system planners at the local, regional, and provincial levels.
Based on an evidence-informed framework, the collaboratively developed Older Adult Experience Survey demonstrates acceptable internal consistency, as well as face, content, construct convergent and construct divergent validity.
The authors would like to thank the RGPs of Ontario Executive Committee for their help in reviewing the wording of the survey items. We would also like to thank Debbie Daly, RN (EC), MN with the Scarborough Hospital GAIN Clinic and Taryn MacKenzie, RN MN ENC(C) APN with the Ottawa Hospital Geriatric Day Hospital for their help with patient recruitment. Finally, we would like to thank the patients and their family members who provided invaluable feedback and made this a much better survey.
The authors declare that no conflicts of interest exist.
1 Wolf J, Niederhauser V, Marshburn D, et al. Defining patient experience. Patient Exp J. 2014;1(1):7–19.
2 Hibbard JH, Stockard J, Mahoney ER, et al. Development of the Patient Activation Measure (PAM): conceptualizing and measuring activation in patients and consumers. Health Serv Res. 2004;39(4 Pt 1):1005–26.
3 Berwick DM, Nolan TW, Whittington J. The triple aim: care, health and cost. Health Affair. 2008;27(3):759–69.
4 Benson T, Potts HW. A short generic patient experience questionnaire: howRwe development and validation. BMC Health Serv Res. 2014;14(1):499.
5 Slater M, Kiran T. Measuring the patient experience in primary care: comparing e-mail and waiting room survey delivery in a family health team. Can Fam Physician. 2016;62(12):e740–e748.
6 Health Quality Ontario. Primary Care Patient Experience Survey (Version 4-2015) [Internet]. Toronto, ON; n.d. Available from: https://www.hqontario.ca/Portals/0/documents/qi/primary-care/primary-care-patient-experience-survey-en.pdf
7 McMurray J, McNeil H, Gordon A, et al. Psychometric testing of a rehabilitative care patient experience instrument. Arch Phys Med Rehabil. 2018;99(9):1840–47.
8 Saunders CL, Abel GA, Lyratzopoulos G. What explains worse patient experience in London? Evidence from secondary analysis of the Cancer Patient Experience Survey [published correction appears in BMJ Open. 2014;4(1):e004039corr1]. BMJ Open. 2014;4(1):e004039.
9 Staniszewska S, Boardman F, Gunn L, et al. The Warwick Patient Experiences Framework: patient-based evidence in clinical guidelines. Int J Qual Health Care. 2014;26(2):151–57.
10 Ontario Ministry of Finance, Office of Economic Policy. Ontario population projections update, 2018–2046 [Internet]. Toronto, ON: Queen’s Printer for Ontario; 2019. Available from: https://www.fin.gov.on.ca/en/economy/demographics/projections/
11 Fried LP, Tangen CM, Walston J, et al. Frailty in older adults: evidence for a phenotype. J Gerontol A Biol Sci Med Sci. 2001; 56(3):M146–M157.
12 Streiner DL, Norman GR. Health measurement scales: a practical guide to their development and use, 4th ed. Oxford, U.K: Oxford University Press; 2008.
13 Health Council of Canada. Fixing the foundation: an update on primary health care and home care renewal in Canada. [Internet]. Toronto, ON: Health Council; 2008. Available from: https://healthcouncilcanada.ca/files/2.26-HCC_PHC_Main_web_E.pdf. Accessed 2020 December 4.
14 Haj-Ali W, Hutchison B. Establishing a primary care performance measurement framework for Ontario [Internet]. Healthc Policy. 2017;12(3):66–79. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5344364/#R4
15 Wong ST, Haggerty J. Measuring patient experiences in primary health care: a review and classification of items and scales used in publicly-available questionnaires. [Internet]. Vancouver, BC: UBC Centre for Health Services and Policy Research; 2013. Available from: https://open.library.ubc.ca/cIRcle/collections/facultyresearchandpublications/52383/items/1.0048528
16 McKenna HP. The Delphi technique: a worthwhile research approach for nursing. J Adv Nurs. 1994;19(6):1221–25.
17 Willis GB, Artino AR. What do our respondents think we’re asking? Using cognitive interviewing to improve medical education surveys. J Grad Med Educ. 2013;5(3):353–56.
18 Willis GB. Cognitive interviewing: a tool for improving questionnaire design. Thousand Oaks, CA: Sage Publ. Inc.; 2005.
19 Tavakol M, Dennick R. Making sense of Cronbach’s alpha. Int J Med Educ. 2011;2:53–55.
20 LaVela SL, Gallan AS. Evaluation and measurement of patient experience. Patient Exp J. 2014;1(1):28–36.
Examples of possible probing questions
What does the term “_” mean to you?
Can you repeat the question I just asked in your own words?
How did you come up with your answer?
Was that easy or hard to answer?
I noticed that you hesitated. Tell me what you were thinking.
Canadian Geriatrics Journal, Vol. 24, No. 2, June 2021