Introduction

In 2014, the Joint Task Force for Clinical Trial Competency (JTF) published its Core Competency Framework [1], which described eight domains and 51 core competencies defining the knowledge, skills, and attitudes necessary for professionals to conduct ethical, high-quality, and safe clinical research. Several issues relating to the conduct of clinical research prompted the JTF to develop the Framework: there was little or no standardization of role descriptions; no generally accepted knowledge base or skill set that defined an entry-level, skilled, or expert professional; no required educational preparation for entry into the field; and hiring and promotions were based primarily on previous experience with no definition of experiential content. Since its original publication, the JTF Framework has been utilized internationally and integrated into the activities of academic institutions, corporate entities, professional associations, educational and training programs, and regulatory agencies [2,3,4,5,6,7,8,9]. In 2017, the Multi-Regional Clinical Trials Center of Brigham and Women’s Hospital and Harvard (MRCT Center) assumed responsibility for the JTF and helped to broaden the recognition and integration of the Framework into the clinical research enterprise through facilitation of an ongoing update process, translation of the Framework into Spanish and Japanese, and creation of a comprehensive and highly interactive website (https://mrctcenter.org/clinical-trial-competency/). The JTF has continually updated and expanded the scope of the Framework to include additional roles (e.g., project management) and to reflect changes in the scientific understanding and methodology utilized in conducting clinical research. In 2018, version 3.1 of the Framework was expanded to reflect the different competencies required of Entry (Fundamental), Mid-level (Skilled), and Expert (Advanced) level professionals [10].

In 2016, the JTF published the results of a global survey [11] in which members of the clinical research enterprise self-assessed their competency based upon the JTF Framework. The results of that survey were used to help better define the varying roles within the clinical research enterprise and to inform the major educational and training needs within the workforce. With further global dissemination over time, the partitioning of the Framework into Fundamental, Skilled, and Advanced competency levels, and the significant expansion and societal recognition of the clinical research enterprise, the JTF wished to reassess the potential of the Framework to guide ongoing professional development. The JTF also considered it important to characterize the current competencies of different roles in this workforce to determine how integration of the Framework would continue to impact the enterprise. Consequently, the JTF developed a second iteration of the global survey of self-assessed competency utilizing the latest version of the Framework to determine how self-assessed competency can vary across the roles, experience, professional certifications, and academic preparation of the workforce. This publication describes that survey, the global response, and current recommendations for use of the Framework to guide the ongoing education and training of the clinical research workforce.

Methods

Survey Tool and Participant Recruitment

An electronic survey tool, based on the 2016 survey, was developed on the online REDCap survey platform; it asked respondents to self-assess competence in each domain rather than in each specific competency. These modifications enabled comparison of the results with the 2016 survey while minimizing survey fatigue and increasing the ease of digital distribution and response. The survey was provided in both English and Spanish. It included a self-assessment of perceived personal competence in each of the eight domains of the JTF Core Competency Framework. Responses were requested on a Likert scale from 1 to 10, and specific examples of competency for each domain were described at the Fundamental level (1–3), the Skilled level (4–7), and the Expert level (8–10). A demographic component followed, asking for professional role, experience, academic degree level, and geographic location, and included several questions concerning participation in education and training programs. Respondents were also asked whether they held certifications from recognized professional bodies in clinical research, since such certification is a common mechanism used to recognize qualification in the field.

A copy of the survey instrument is provided as Supplemental Table 1.
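As a minimal illustration of the rating bands described above (Fundamental 1–3, Skilled 4–7, Expert 8–10), the following Python sketch maps a numeric self-rating to its competency level. The function and variable names are illustrative only and are not part of the survey instrument.

```python
# Minimal sketch: map a 1-10 self-assessment rating to the competency level
# described in the survey (Fundamental 1-3, Skilled 4-7, Expert 8-10).
# Names are illustrative placeholders, not taken from the survey instrument.

def competency_level(rating: int) -> str:
    """Return the competency level implied by a 1-10 self-rating."""
    if not 1 <= rating <= 10:
        raise ValueError("rating must be between 1 and 10")
    if rating <= 3:
        return "Fundamental"
    if rating <= 7:
        return "Skilled"
    return "Expert"

# Example: a self-rating of 6 in a domain falls within the Skilled band.
print(competency_level(6))  # -> "Skilled"
```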

Individuals working in clinical research, inclusive of the roles of principal/co-principal investigator (PI/Co-PI), clinical research associate/monitor (CRA), clinical research coordinator/nurse (CRC/CRN), regulatory affairs professional (RA), clinical project manager/research manager (PM/RM), and educator/trainer were targeted as survey participants.

A snowball sampling approach was used for survey dissemination, including outreach through professional contacts, social media, and the collaboration of related professional organizations. The survey was launched on June 1, 2020, and closed on November 30, 2020. Participation in the survey was anonymous, and no record of respondents’ IP addresses was collected. Because the survey was devised as a snowball sample, population denominators could not be estimated, and the results were not interpreted as representative of the clinical and translational research workforce as a whole. Descriptive statistics were used to evaluate the distribution of respondents’ assessment scores within and across subgroups of the workforce.
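As a sketch of the descriptive approach described above, the following Python example computes mean domain ratings within subgroups of the workforce (here, by role), in the manner of the summary tables reported in the Results. The DataFrame contents, column names, and values are hypothetical placeholders for illustration, not the actual survey data.

```python
# Sketch of the descriptive analysis: mean self-assessed rating per domain within
# workforce subgroups (here, by role). All data below are hypothetical placeholders.
import pandas as pd

responses = pd.DataFrame({
    "role": ["CRC/CRN", "CRC/CRN", "PI/Co-PI", "PM/RM"],
    "Scientific Concepts and Research Design": [5, 6, 8, 7],
    "Clinical Study Operations (GCPs)": [7, 8, 9, 9],
    "Data Management and Informatics": [4, 5, 6, 7],
})

# Average rating by role for each domain, rounded to one decimal place.
by_role = responses.groupby("role").mean(numeric_only=True).round(1)
print(by_role)

# Distribution of ratings across all respondents for each domain.
print(responses.drop(columns="role").describe())
```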

The research protocol was approved as “exempt” research by the Massachusetts General Brigham (Partners) Institutional Review Board.

Results

Survey responses were received from 825 individuals across the globe. A total of 661 (80%) completed the entire survey; only the results of completed surveys were analyzed.

Perceptions of Competency by Role

Table 1 shows the average level of self-assessed competency by domain and by role. As expected, roles with supervisory or education/training responsibilities scored higher. Based upon the average rating, all members of the clinical research team rated themselves as functioning within the Skilled or Advanced category. Respondents reported having the lowest levels of self-confidence in the domains of Scientific Concepts and Research Design, Investigational Product Development and Regulation, and Data Management and Informatics.

Table 1 Average self-assessed competency rating by role and by domain

Competency by Experience

Table 2 shows the average level of self-assessed competency by experience for each of the domains of the JTF Competency Framework for individuals working in clinical research. As would be expected, competency level rises as years of experience increase. Generally, the hiring requirements for positions working with sponsors and CROs state that applicants must have more than 2 years of experience [12]. The Association of Clinical Research Professionals (ACRP) and the Society of Clinical Research Associates (SoCRA), the two major professional certifying bodies for clinical research professionals, require 2 years of experience to qualify to take their certification examinations [13, 14]. It is notable that respondents with 0–2 years of experience self-assessed at much lower levels than those with 3 or more years of experience across all domains.

Table 2 Self-assessed competency rating by experience

Competency by Academic Degree Level

Table 3 shows the average self-assessed competency level by academic degree level for the eight domains of the JTF Competency Framework. Considered overall, the level of self-assessed competency increases steadily for each domain with the level of academic degree. Respondents holding no academic degree rated themselves with lower levels of competency in all of the domains. Irrespective of educational degree, lower levels of competency were observed in the domains of Ethical and Safety Considerations, Investigational Product Development and Regulation, and Data Management and Informatics. Ratings of Study and Site Management were more variable.

Table 3 Average self-assessed competency rating by educational level by domain

Professional Certification and Self-assessed Competency

Two professional credentialing organizations for the clinical research profession are the Association of Clinical Research Professionals (ACRP) and the Society of Clinical Research Associates (SoCRA). Of the respondents who completed the survey, 274 reported being certified by ACRP, SoCRA, or both, and 306 reported holding no professional certification. Table 4 shows the average self-assessed competency ratings of ACRP- and SoCRA-certified and non-certified clinical research professionals for each of the JTF Competency Framework domains. Respondents with no professional certification rated themselves lower in each of the JTF Framework domains than those who are professionally certified by ACRP and/or SoCRA.

Table 4 Average self-assessed competency rating by domain and professional certification

Competency by Organizational Affiliation

The clinical research enterprise is composed of sponsors, contract research organizations (CROs), clinical sites (both in academic institutions and private), and academic institutions that educate clinical research professionals. Table 5 shows the average self-assessed competency level by organizational affiliation and by domain. As can be seen, the averaged self-assessments are at a high level for all domains, approaching Expert in many, and there is little variation in self-assessed competency between organizational affiliations. In general, respondents reported lower levels of self-assessed competency in Scientific Concepts and Research Design, Investigational Product Development and Regulation, and Data Management and Informatics regardless of their organizational affiliation. Respondents in academic teaching organizations reported higher levels of competency in Scientific Concepts and Research Design than other respondents. Respondents in corporate pharmaceutical organizations reported higher levels of competency in Investigational Product Development and Regulation and in Communication and Teamwork, while individuals in private clinical sites reported lower levels of competency in the Scientific Concepts and Research Design and Communication and Teamwork domains.

Table 5 Average self-assessed competency rating by organizational affiliation

Competency and Education/Training

One of the original objectives of the JTF in developing the Core Competency Framework was to help make clinical research professionals aware of their education and training needs and to stimulate further efforts to enhance their competency. Of the 661 respondents who completed the survey, only 174 (35.7%) indicated that they had attended academic coursework and/or continuing education or training (“coursework/training”) in the past 2 years. Table 6 shows the percentage of respondents who received coursework/training in each of the domains. Among those who did attend coursework or training, the topics with the highest percentages of attendance were in the domains of Clinical Study Operations (GCPs) (57%) and Ethical and Safety Considerations (46%); these were the domains showing the highest levels of competency in almost all groups. The areas of education/training with the lowest percentages of attendance were Data Management and Informatics (25%), Investigational Product Development and Regulation (28%), Communication and Teamwork (28%), and Scientific Concepts and Research Design (30%), the domains consistently showing the lowest levels of competency for almost all groups.

Table 6 Percent of respondents that received education/training within past 2 years by domain

Discussion

The 2016 study published by the JTF [11] concluded that the workforce assessed itself as competent in the domains of Ethical and Safety Considerations, Clinical Study Operations (GCPs), and Leadership and Professionalism, but that there were significant gaps in the self-assessed competency of individuals employed in the enterprise for other domains, irrespective of role, experience, or educational level. The domains where lower levels of competency were noted were Scientific Concepts and Research Design and Investigational Product Development and Regulation. Competency levels in the domains of Study and Site Management, Data Management and Informatics, and Communication and Teamwork were found to be dependent on role, educational level, and experience.

The findings of this study update and extend this past work in important ways. As with the 2016 study, the results of the 2020 study support the claim that the self-confidence of the clinical and translational research workforce is lowest in the domains of Scientific Concepts and Research Design and Investigational Product Development and Regulation. Similarly, the results of this study suggest that key portions of the workforce lack some confidence in skills associated with Data Management and Informatics and Ethical and Safety Considerations.

Like the 2016 study, the current work also found substantive differences in the self-confidence of respondents based on their role on a study team. The further definition of the JTF domains into Fundamental, Skilled, and Advanced levels of competency enabled these levels to be incorporated into the current study questionnaire. By the average rating, all member roles of the clinical research team rated themselves as functioning within the Skilled or Advanced categories in each domain; the 2020 results, however, suggest that both Educators/Trainers and Project Managers/Research Managers report comparatively high levels of confidence across domains compared with respondents whose role may include more direct contact with study participants. In the case of Project Managers/Research Managers, the self-assessments are higher than those of PI/Co-PIs. Comparatively high levels of variation in the domain scores were reported by Regulatory Affairs Professionals. While 2020 respondents working in teaching-related academic organizations reported having a higher level of competency in Scientific Concepts and Research Design, those in corporate pharmaceutical organizations reported higher levels of competency in Investigational Product Development and Regulation and in Communication and Teamwork.

Key differences in the confidence of the workforce persist across education and experience, as was also found in the 2016 study. Specifically, respondents with higher levels of postsecondary education reported higher levels of self-confidence in their research skills, as did respondents who are professionally certified by ACRP and/or SoCRA compared with those who are not. Interestingly, 2020 respondents’ self-confidence in their research skills increased with experience, but not uniformly across all domains, with the scores in the domains of Clinical Study Operations (GCPs) and Data Management and Informatics showing more variability with experience compared with the other domains. Taken together, these results suggest that the clinical and translational workforce may need further training not only in the domains of Scientific Concepts and Research Design and Investigational Product Development and Regulation, but also in the domains of Data Management and Informatics and Ethical and Safety Considerations. In addition, this evidence indicates that the composition of clinical research teams should be carefully considered, since researchers in specific roles may have considerable expertise in areas that their colleagues lack. For example, there may be opportunities for experienced Regulatory Affairs Professionals working in corporate pharmaceutical/biotech organizations to impart their distinctive skills in Investigational Product Development and Regulation to their colleagues, and for experienced Educators/Trainers working in academic teaching organizations to do so in the domain of Scientific Concepts and Research Design. The results of this study reinforce the value of formal education and ACRP/SoCRA certification in increasing researchers’ confidence in skills in every domain.

In order to inform new recommendations for further enhancing the professional development of the clinical and translational workforce, this study also assessed respondents’ professional development activities. Notably, only 35.7% of respondents reported participating in education/training in the past 2 years. Additionally, respondents were least likely to report participating in training related to the domains showing the lowest levels of self-confidence. Comparatively low percentages of respondents reported that they received training in Investigational Product Development and Regulation (28%) and Scientific Concepts and Research Design (30%), whereas higher percentages reported participating in training in Ethical and Safety Considerations (46%) and Clinical Study Operations (GCPs) (57%). Additional training correlated with higher levels of reported self-confidence. This study may be used to guide the professional development of the workforce and to benchmark progress over time, but with certain limitations.

Limitations

The limitations of this study relate to both the methods and the applicability of the findings. First, the survey was administered using a snowball sampling procedure, given the inherent difficulties in identifying a sample of individuals that would be representative of the clinical and translational research workforce as a whole. While snowball sampling was convenient, its use in this study has the potential to limit the reliability and validity of the results. For this reason, the training needs identified in this study should be further validated by replicating the study within research organizations and groups before being used to guide the professional development or advancement opportunities provided to their members.

Second, there are inherent limitations associated with the use of self-assessments for measuring clinical and translational research skills and knowledge, since these types of assessments are not objective [15]. Specifically, although subjective and objective competency-assessment scores have been found to increase as a result of completing clinical research training programs [16,17,18,19,20,21,22] and of accruing research experience [23,24,25], some empirical studies have found few or no significant differences between the self-assessed competence of more advanced and less advanced clinical research professionals [26,27,28]. Although the known-groups (between-groups) validity [29,30,31] of the assessment used in this study can be supported by comparing the results to those of other validation studies of these types of assessments, rigorous empirical research demonstrates that the results of subjective and objective assessments of skills often diverge [32, 33]. This limitation reinforces the need not only to replicate this study within clinical and translational research organizations and groups, but also to compare the results of self-assessments with objective assessments of research skills. To date, very few published studies of clinical research competency have utilized rigorous, objective, and reproducible methods of assessment. Such methods of evaluation may be utilized to assess competence in corporate environments, but they are not cited in the professional literature.

Conclusions

Over the past decade, there has been a significant increase in both the number and the complexity of clinical trials [34]. The result has been a growing global demand for clinical research professionals and a consequent severe shortage of personnel [35].

Clinical research is one of the few health professions with no entry-level educational requirement: there is no required educational background or defined set of competencies for becoming a clinical research professional [12], and a majority of the current workforce has been trained “on the job.” Although an increasing number of sponsors, clinical sites, and CROs have acknowledged that professional certification improves the level of competency [36] and are requiring professional certification for their new employees, this requirement is not yet standard across the field of clinical and translational science.

This dynamic has motivated efforts by many professional organizations to develop frameworks of defined competencies for the many roles within the clinical research enterprise [37,38,39,40]. The 2014 Joint Task Force for Clinical Trial Competency (JTF) Framework has been widely adopted and utilized to standardize role descriptions [2], define onboarding training and continuing education content [3, 41], inform upward mobility and promotion criteria [5], and define the content for academic programs that educate and train new clinical research professionals [42].

The 2016 JTF global survey of clinical research competency [11] concluded that the workforce self-assessed as generally competent in the domains of Ethical and Safety Considerations, Clinical Study Operations (GCPs), and Leadership and Professionalism. These are the domains in which continuing training is most commonly offered by professional organizations and required by regulatory bodies. The findings of the current survey demonstrate how self-assessed competency in the JTF domains varies by role, experience, education, certification, and organizational type of the respondents. The results inform the recommendation that training be provided in the domains of Scientific Concepts and Research Design, Investigational Product Development and Regulation, and Data Management and Informatics, irrespective of role, educational level, or experience. Equally important is the recommendation that clinical and translational research organizations and clinical sites assess training needs locally, using both subjective and objective measures of skill and knowledge.

The 2016 survey questioned the efficacy of the “on the job” training model and recommended that education in research methods be required for physicians in their medical school curriculum and that clinical research coordinators and monitors be required to have basic education in the sciences prior to employment. The current survey validates those recommendations and those of others [43, 44] that training curricula include additional content relating to research methods, protocol design, and medical product development and regulation, and it extends those recommendations to include further training in data management and informatics. The JTF Framework will continue to inform, identify, assess, and address the need for relevant and rigorous training for the workforce.