Residency applications are both an exciting and stressful part of the fourth year of medical school. Applications are not only an investment in a future career but a significant financial expense as well. A 2015 survey of US medical students found that on average they applied to 36.4 programs and attended 12.6 interviews each [1]. This process poses a financial burden, as applicants spend an average of $3500 on applications and interviews, with nearly a third spending over $5000 and over half citing financial burden as a limiting factor in attending interviews [2, 3]. The transition to virtual interviews during the COVID-19 pandemic significantly changed the landscape of residency applications. As a result, 40% of applicants applied to more programs than in prior years, and over half reported reduced cost and more flexibility [4], adding to the trend of increasing numbers of applications per applicant [5]. The Association of American Medical Colleges favors virtual interview options because they are less burdensome on applicants' time and finances and reduce carbon emissions [6]. However, more than half of applicants felt it was challenging to determine program culture and goodness of fit from program websites [4]. From the perspective of the residency program, over half of program directors rated their program's reliance on its website as significant, and a majority reported that improving the program website would be beneficial [4]. Applicants in neurosurgery responded that program websites convey useful variables that impacted their rank lists, though they believed program websites could use reorganization and do not replace an in-person experience [7].

Psychiatry was among the top five specialties in the 2023 National Resident Matching Program's Main Residency Match, with 3039 applications for 2164 residency positions [8]. The top factors psychiatry applicants consider when selecting programs to apply to are geographic location, goodness of fit, clinical setting, perceived quality of the program's education, and perceived qualities of residents and faculty [9]. An editorial from a program director and an applicant further outlined their recommendations on which important content areas should be included on residency program websites [10]. Websites that address these factors, which applicants weigh in selecting and ranking programs, will likely be most beneficial. While there have been many assessments of residency program websites over the past few years [11,12,13,14,15,16,17,18,19,20,21,22,23], no evaluation of general adult psychiatry program website content was found in the literature. The consensus of these studies of residency program websites in other specialties is that there is room to improve ease of access and to include and update information helpful to applicants, though there is a paucity of literature on which criteria applicants find helpful. Our hypothesis was that program website content would differ between geographic regions. The goal of this study was to evaluate all US general adult psychiatry websites for informational content that may assist prospective applicants in finding their best fit.

Methods

A survey study design was utilized to evaluate all US general psychiatry websites listed in the American Medical Association's Fellowship and Residency Electronic Interactive Database (FREIDA) [24]. A 44-item rating scale was developed to evaluate each program website based on five constructs: residents, faculty, residency, program, and education (available on request from the corresponding author). The rating scale was based on similar studies assessing residency websites [23] and on website recommendations published in an editorial by a program director and an applicant [10], with a focus on content relevant to psychiatry. The study followed the Consensus-Based Checklist for Reporting of Survey Studies (CROSS) reporting guideline, available on the EQUATOR Network. This project was reviewed and declared exempt by the University of Kansas Medical Center Human Research Protection Program.

FREIDA allows medical students to search for a residency or fellowship from more than 12,000 accredited programs [24]. Residency programs were separated into the four US Census Bureau geographic regions (West, South, Midwest, and Northeast), as utilized in another website study [23]. FREIDA lists programs as university-based, community-based/university-affiliated, or community-based, and these categories were used to identify program type in our rating scale. FREIDA also includes a hyperlink for most programs, which was used to identify the web address of the program website. For programs without a working link, a Google search of ["program name" + "psychiatry residency"] was performed.

Program websites were systematically evaluated between September 2021 and January 2022. Website items were evaluated by two raters and assessed for reliability with the McNemar test for dichotomous outcomes: either an item was present on the website ("yes") or it was not ("no"). Cronbach's alpha was calculated to measure overall internal consistency, and Cohen's kappa was used to measure the extent to which the two raters agreed. Reliability and agreement measures were also conducted by construct, region, program type, and specific website content items. Constructs were defined as five topics (current resident information, descriptions of faculty, residency application and administrative details, program location and setting, and educational opportunities) that were assigned during item development. Regions included Midwest, Northeast, South, and West. Program types were categorized as community-based, community-based/university-affiliated, or university-based.
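The agreement statistics above can be sketched as follows. This is a minimal illustration on a hypothetical 2x2 table of dichotomous ratings, not the study's actual data; the cell counts below are invented for demonstration only.

```python
# Hypothetical agreement table for two raters scoring items present/absent:
# rows = rater 1 (yes, no), columns = rater 2 (yes, no).
a, b = 5000, 250   # a: both rated "yes"; b: rater 1 "yes", rater 2 "no"
c, d = 550, 6000   # c: rater 1 "no", rater 2 "yes"; d: both rated "no"
n = a + b + c + d

# McNemar's chi-square statistic depends only on the discordant pairs b and c,
# so it tests whether the two raters disagree asymmetrically.
mcnemar_chi2 = (b - c) ** 2 / (b + c)

# Cohen's kappa: observed agreement corrected for chance-expected agreement.
p_obs = (a + d) / n
p_exp = ((a + b) / n) * ((a + c) / n) + ((c + d) / n) * ((b + d) / n)
kappa = (p_obs - p_exp) / (1 - p_exp)
```

With these illustrative counts, a large imbalance between b and c yields a significant McNemar result even though overall agreement (kappa) remains high, mirroring the pattern reported in the Results.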

Descriptive statistics were conducted to measure the frequency and percentage of items occurring on each website. Because data tended to be sparse, Fisher's exact tests were used to assess response differences by region and program type. Chi-square tests were used to assess associations between program type, region, and where the FREIDA link directed the user. Extended Rasch models were conducted to estimate item difficulty parameters, which measured the likelihood of an item being listed on a website. Negative values indicated items that were most likely to be included on each website (least difficult to find), while positive values indicated items that were least likely to be included (most difficult to find). In addition to difficulty estimates, standard errors (SE) and 95% confidence intervals were reported by item and by construct. A series of multidimensional scaling analyses using the alternating least squares algorithm (ALSCAL) was conducted to evaluate the distribution of items, that is, how scores placed item responses in close proximity. These patterns were evaluated by region and by program type using binary Euclidean distance measures for nominal data. Goodness-of-fit measures (how well the overall pattern fit the data) were assessed using Stress and R2. To interpret Stress measures, we used the following categories: 0.20 or more was a poor fit; 0.10 was fair; 0.05 was good; 0.025 was excellent; and 0.000 was deemed a perfect fit. All statistical tests were two-sided and conducted in IBM SPSS Statistics, version 26. The level of significance was adjusted to p<0.001 to account for multiple tests.
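The bivariate comparisons can be illustrated with a minimal two-sided Fisher's exact test for a 2x2 table, implemented from the hypergeometric distribution; the counts passed in below are approximations reconstructed from percentages for demonstration, not the study's raw data.

```python
# Minimal two-sided Fisher's exact test for a 2x2 table with fixed margins:
# sums the probabilities of all tables no more likely than the observed one.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided p-value for the table [[a, b], [c, d]]."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    total = comb(n, col1)

    def prob(x):  # P(top-left cell = x) under the hypergeometric null
        return comb(row1, x) * comb(row2, col1 - x) / total

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Illustrative comparison: resident names listed on ~97% of 125
# university-based programs vs. ~62% of 55 community-based programs.
p = fisher_exact_2x2(121, 4, 34, 21)
```

A difference this large comfortably clears the study's adjusted significance threshold of p<0.001, consistent with the program-type comparisons reported in the Results.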

Results

As of March 1, 2022, there were 285 general US psychiatry programs with 2089 first-year residency positions. Of the 285 general adult psychiatry programs listed in the FREIDA database, 272 programs were evaluated and included in the analysis. Five military residency programs, one government program, and seven programs without websites were excluded. By region, there were 60 programs in the Midwest, 77 programs in the Northeast, 95 programs in the South, and 40 programs in the West. Program setting type was also collected, with 125 university-based programs, 92 community-based/university-affiliated programs, and 55 community-based programs.

A total of 11,960 items from both raters were evaluated; rater 1 had six missing values, and rater 2 had two missing values, which were excluded. The result of the overall McNemar test was significant, p<0.001, indicating there was a significant difference in the way raters assessed the items. Discordant pairs accounted for these differences: 564 items were rated as absent by rater 1 and present by rater 2, and 247 items were rated as present by rater 1 and absent by rater 2. After item discrepancies were settled by a third reviewer, Cronbach's alpha was 0.927, indicating high overall internal consistency among the raters. Similarly, Cohen's kappa showed a high level of agreement: κ=0.863, p<0.001. Rater agreement by construct ranged from κ=0.818 for faculty items to κ=0.913 for resident items. Rater agreement by region ranged from κ=0.841 for institutions located in the West to κ=0.879 for those in the Northeast. Similar results were observed by program type, ranging from κ=0.850 to κ=0.881. With regard to reliability by item, ratings for the residency application process were the least reliable, κ=0.566, followed by educational rotation information, κ=0.603. The most reliable items included educational didactic information, κ=0.934, and residency fellowships, κ=0.931.

Programs were evaluated using 44 criteria split into five subsections; programs were more likely to report residency information (68%) and least likely to report program-setting criteria (46%). While 13 of 44 criteria were found on >75% of websites, 18 criteria were found on less than half. The top 10% most-often reported items were application process (94%), rotation information (93%), faculty name (92%), and hospital information (88%). The bottom 10% least-often reported items were resident research publications/experience (14%), housing options (15%), social events (20%), and call schedule information (21%) (Table 1).

Table 1 Responses to the 44-item questionnaire

Bivariate comparisons of item responses by region revealed very little difference between regions. Bivariate associations by program type are shown in Table 1. Many significant differences were observed by program type, mainly for faculty information and current residents. University-based programs tended to include more information on their websites than community-based programs. For example, names of current residents appeared more often on university-based websites (97%) than on community-based websites (62%). The application process appeared on 98% of university-based websites vs. 87% of community-based websites, resident research publications/experience on 22% vs. 4%, and social events on 30% vs. 5%, respectively.

Another interesting and significant finding was the difference in the webpage to which the FREIDA hyperlink directed users. Links for university-based programs more often directed to residency information (42% vs. 27% for community-based programs), whereas links for community-based programs more often directed to hospital information (56% vs. 5.6% for university-based programs). Nationally, seven programs lacked a website entirely. A further 45 programs did not appear to list a weblink on FREIDA or had a broken weblink listed.

Table 2 shows results from the overall extended Rasch model by item difficulty, listed in order from most difficult to least difficult. Research publications/experience by residents had a positive score of 2.40, indicating that this item was highly unlikely to appear on a given website. Conversely, the residency application process had a negative score of –2.71, indicating it almost always appeared on websites.
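To make the difficulty estimates concrete, the following sketch converts them to probabilities under a simple dichotomous Rasch model (a simplification of the extended models used in the study): P(item present) = 1/(1 + exp(b − θ)), where b is the item difficulty and θ the website's overall propensity to report items, with θ = 0 representing an average website. The two difficulty values are taken from Table 2; the model form here is illustrative.

```python
# Interpret Rasch item difficulties as probabilities for an "average" website
# (theta = 0) under the basic dichotomous Rasch model.
from math import exp

def p_present(difficulty, theta=0.0):
    """Probability an item appears on a website with propensity theta."""
    return 1.0 / (1.0 + exp(difficulty - theta))

p_research = p_present(2.40)   # hardest item: resident research output
p_apply = p_present(-2.71)     # easiest item: application process
```

Under these assumptions, an average website would list resident research output less than one time in ten but the application process more than nine times in ten, matching the direction of the observed frequencies (14% and 94%, respectively).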

Table 2 Extended Rasch model by item (category) difficulty

Discussion

Program websites are an easily accessible and important source of information for prospective residency applicants, both in deciding which programs to apply to and then how to rank them. This was the first study to our knowledge to examine content that may be helpful for applicants on US general psychiatry websites. We focused on content previously discussed in the literature for non-psychiatric programs. To be of use to residency candidates, a website must first be accessible with minimal barriers. Only one-third of general adult psychiatry residency programs posted on FREIDA provided a link that led directly to their program page. Instead, many of the links directed to generic pages for a department, university, or health system and required a Google search to locate the residency program website. The reliability of FREIDA-provided website links was associated with program type, with university-based programs more likely than community-based programs to link directly to program websites.

Our data show that program websites also vary in quality. Our initial hypothesis was that program websites would differ between regions. The categorical data shown in Table 1 and the bivariate analysis show few significant differences between regions for each criterion assessed. Rather, the data reveal that program type (university- vs. community-based) accounted for more of the significant differences. Website criteria of interest were most likely to be found on university-based program websites and less likely to be found on community-based program websites. Only three of 44 assessed criteria varied significantly between regions, while 13 criteria differed significantly by program type, with the most variability found for current residents and alumni and for current resident names.

Overall, content that fell within the category we labeled "residency information," which included general information such as salary, benefits, application process, and contact information, was most likely to be reported, and content within "program setting," which included data regarding hospitals, clinics, local attractions, and patient population, was least likely to be reported. This finding is concerning during a time of virtual interviews, when the website could help prospective applicants evaluate their best fit with a program. In a survey of applicants in another specialty, criteria on clinical sites, rotation and call schedules, resident and faculty information, and videos were rated as most helpful [7]. Most concerning, roughly one-third of the information we evaluated (14 of 44 criteria) was not accessible on more than half of the general adult psychiatry program websites reviewed. Only one-third of assessed criteria were present on more than three-quarters of websites.

With regard to comparisons by region, the West tended to report more criteria, while the South reported the fewest. The Midwest reported patient population at the lowest rate. The Northeast and Midwest reported alumni fellowships/jobs more frequently than the South and West. Interestingly, alumni prospects did not appear to be valued more by academic programs, as the South contains the most university-based programs. Though not significant, housing options was one of the lowest reported variables; the Northeast and West reported it at two to three times the rate of the Midwest and South, which may be explained by higher housing prices in those regions.

When examining statistically significant differences by program type, a more consistent picture emerged. University-based programs reported nearly all criteria at higher rates, and this trend held for each statistically significant criterion. This difference may be due to a variety of factors that would be difficult to capture with certainty without delving into institutional specifics, but it leaves room for speculation. University-based programs may have more resources and institutional support for their residency programs overall, which may translate to a more comprehensive program website. Additionally, information regarding alumni, faculty profiles, and grand rounds may have a more academic focus and correspondingly is presented on university-based program websites at a higher rate. With the creation of new general psychiatry residency programs over the past few years, barriers for young programs in creating their websites may include reduced resources and less content to present. It is not clear whether these newer programs represent one program type or region more than another. These results illustrate the opportunity for community-based programs to update their program websites.

The use of two separate raters for website survey and data collection, with a third rater serving to resolve differences, coupled with our reassuring interrater reliability, was a strength of this study. However, all raters came from the same program, which could have introduced bias. Nearly 12,000 data points were examined and analyzed as part of this thorough study encompassing all US general psychiatry residency programs. Three authors collected data from websites, but there was no standardized assignment of authors to primary assessment or review, which may have affected the data collected. Another limitation was the period of data collection, which occurred over 4 months; active updating of program websites may have occurred during this time, which a shorter survey period would have addressed. A further limitation was that the quantity of criteria for each program may have led to assessment fatigue during evaluation. However, these two limitations lend "real-world" value to our assessment, as they would similarly impact prospective applicants. Criteria selected for website evaluation in this study were also assumed, rather than shown, to be informative and important to prospective applicants.

This study benefits academic psychiatry at large by characterizing the current content available on residency program websites, and its findings are generalizable to all programs. We recommend that programs update their website links on FREIDA to improve access to program information for prospective applicants. Additionally, addressing the content areas found less frequently in this study would improve program website utility for prospective applicants. Though websites varied in quality by program type, all programs should assess their website through the lens of a prospective applicant for the content areas assessed in this study. Further work in this area could assess the attitudes of prospective psychiatry applicants regarding which information would be most helpful to have accessible on residency program websites when building application and rank lists.

In conclusion, a majority of US general psychiatry residency programs maintain program websites that are accessible via FREIDA. Survey of these websites by study raters using a 44-item questionnaire with high interrater reliability revealed significant differences. Though comparisons by program region showed only isolated small trends, comparisons by program type revealed significant differences, likely attributable to a variety of factors. Development of program websites with up-to-date and helpful information, in accordance with best practices, would benefit both prospective residency applicants and programs.