The majority of people seeking mental health services do not receive treatments shown in rigorous randomized controlled trials to improve clinical outcomes, despite accumulated evidence of and consensus on effective psychosocial treatments for psychiatric disorders (Ghandour et al. 2019; United States Congress Senate Committee on Health Pensions Subcommittee on Substance Abuse Mental Health Services 2004; United States President's New Freedom Commission on Mental Health 2003) and significant policy efforts to endorse, promote, and actively implement these evidence-based practices (EBPs). The field of implementation science was developed to better understand the factors that facilitate or impede the implementation of EBPs and to develop and test strategies to increase effective implementation (Williams and Beidas 2019).

Though the field of implementation science has advanced considerably in the past two decades, implementation strategies—the interventions used to increase adoption, implementation, and sustainment of EBPs in health services (Brownson et al. 2017)—are still rarely developed in a systematic way that targets specific barriers to implementation (Powell et al. 2019). For example, a recent systematic review found that only 10% of trials testing an implementation strategy in mental health settings identified a priori the specific barrier the strategy addressed, or the mechanism through which the strategy was believed to improve implementation, and that none of the trials testing a mechanism or barrier supported the strategy's underlying theory of change (Williams 2016). Additionally, there is evidence that traditional elicitation methods for identifying barriers (e.g., qualitative interviews, surveys) often fail to surface the actual barriers to behavior because people may not be consciously aware of the true impediments to their behavior (Lopetegui et al. 2014). This suggests that (a) many implementation strategies are not designed using a clear and scientifically grounded theory of change regarding the specific behavioral barriers that impede or facilitate the use of EBPs, and (b) those that are designed with an underlying theory of change are not working in the way the investigators anticipated. Behavioral barriers are cognitive or psychological processes, operating prior to or during decision-making or behavioral enactment, that impede or otherwise get in the way of achieving a target or focal behavior. They are distinguished from structural barriers, which are factors external to the person that may impede the person's ability to select or enact a specific behavior. Behavioral barriers often operate outside of conscious awareness and are the focus of a large body of empirical research within the field of behavioral economics (Buttenheim et al. 2019; Datta and Mullainathan 2014; Spring et al. 2016). They emerge at the intersection of psychological or cognitive factors and the context of the decisions and actions that comprise the behavioral target.

In the current study, we utilized a novel approach (called NUDGE, for Narrow, Understand, Discover, Generate, and Evaluate) to rigorously identify behavioral barriers to EBP implementation in community mental health settings. NUDGE is a systematic design approach adapted by one author (AB) from existing behavioral economics, behavioral design, and innovation methodologies (e.g., Asch et al. 2014; Council 2005; Darling et al. 2017; Datta and Mullainathan 2014; Kok et al. 2016; Tantia 2017) as a methodological advance in implementation strategy design. NUDGE was developed in this context for a parent project focused on applying insights from behavioral economics to advance the science and practice of implementing EBPs in community mental health. NUDGE distills barriers to an identified problem and identifies principles and constructs from behavioral economics that can inform implementation strategy design to address those barriers. A central thesis of behavioral economics is that people infrequently behave or make decisions "rationally" as most traditional utility-maximization models posit (Fiske and Taylor 2013; Kahneman and Tversky 1979; Tversky and Kahneman 1981). Instead, people choose the best option they can given their preferences, limited cognitive and attentional resources, and available information. The field of behavioral economics emerged from research on the limitations of human judgment and decision-making pioneered by Tversky and Kahneman, and it offers novel ways to identify and leverage these "predictably irrational" (Ariely 2008) tendencies to design behavioral solutions. Examples of strategies that leverage insights from behavioral economics include strategically deployed financial incentives and "nudges," such as changing the default setting in an electronic health record to make it easier for individuals to perform a desired behavior (Patel et al. 2016). Behavioral economic theories and methods, drawn from fields as diverse as cognitive psychology and economics, have been successful in identifying behavioral barriers relevant to public health challenges (Kooreman and Prast 2010) and have recently shown promise in solving implementation problems in medical settings (Patel et al. 2018). However, they have not yet been employed to aid implementation strategy design in mental health.

Despite evidence that tailoring strategies to barriers may increase the effectiveness of implementation efforts, the development of implementation strategies to promote evidence-based practices is frequently not itself evidence-based (Baker et al. 2015; Powell et al. 2017). One way of addressing the implementation gap in mental health services, and the premise of the current paper, is to design implementation strategies using approaches that specifically delineate the exact behavioral barriers that impede EBP use in mental health settings. The current study applies the NUDGE method to a novel set of data (ideas generated by clinicians through a crowdsourcing challenge) to identify behavioral barriers and insights that can be directly leveraged to inform implementation strategy design in community mental health.

Methods

Setting

We conducted this study within the context of a publicly-funded behavioral health system in Philadelphia County. The Philadelphia Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) is a large publicly-funded behavioral health system that annually oversees the services provided to approximately 169,000 individuals. Since 2007, DBHIDS has supported the implementation of five EBPs through training, ongoing consultation, and internal staff time to coordinate implementation: Cognitive Behavior Therapy (Creed et al. 2014; Stirman et al. 2009), Trauma-Focused Cognitive-Behavioral Therapy (Beidas et al. 2016; Cohen et al. 2004), Prolonged Exposure (Foa et al. 2005), Dialectical Behavior Therapy (Linehan et al. 2006), and Parent–Child Interaction Therapy (Thomas et al. 2017). The Evidence-Based Practice and Innovation Center (EPIC) was launched in 2013 as a centralized infrastructure for these EBP initiatives and to align policy, fiscal, and contracting processes for the delivery of EBPs, including an EBP "designation" process coupled with enhanced rates (Beidas et al. 2013, 2015, 2019b; Powell et al. 2016). This study is part of a federally-funded research program applying behavioral economics to the implementation of evidence-based practices in community mental health settings, and it is situated within one of the program's research projects focused on eliciting clinician preferences and processes around EBP implementation (Beidas et al. 2019a).

Data

The raw data used in our analysis were 65 proposed solutions to the problem of EBP underutilization submitted by 55 clinicians in Philadelphia's publicly-funded mental health system (Stewart et al. 2019). Clinicians submitted their ideas in response to a system-wide, web-based crowdsourcing challenge, called an innovation tournament, that was conducted as part of the parent program described above (Beidas et al. 2019a). Innovation tournaments are a form of crowdsourcing designed to elicit divergent and novel solutions to complex and intractable problems by leveraging the direct experience, expertise, and practice wisdom of frontline providers and staff who work within a system and encounter the challenge frequently. Innovation tournaments have become increasingly popular and have been used in settings as varied as healthcare, technology, law, and energy (Terwiesch and Ulrich 2009; Bjelland and Wood 2008; Jouret 2009; Mak et al. 2019; Merchant et al. 2014; Wong et al. 2020). "The Philadelphia Clinician Crowdsourcing Challenge" invited clinicians from the 210 publicly-funded behavioral health organizations in Philadelphia to submit an idea via email in response to the following prompt: "How can your organization help you use evidence-based practices in your work?" The 65 proposed solutions were categorized into eight non-mutually exclusive categories: training (42%), financing and compensation (26%), clinician support and preparation tools (22%), client support (22%), EBP-focused supervision (17%), changes to the scope or definition of EBPs (8%), changes to the system and structure of publicly-funded behavioral health care (6%), and other (11%). The tournament process and results are detailed elsewhere (Stewart et al. 2019).

Approach

Framework

We used the NUDGE framework for designing behaviorally-informed interventions (see Fig. 1). The NUDGE approach begins when investigators Narrow the focus of the analysis to a specific, relevant behavioral target. Next, investigators seek to Understand the context of the behavior through inquiry into the decision-making process and related actions. Investigators then Discover insights about barriers to the target behavior by uniting the rich contextual inquiry (practitioner-proposed solutions, in the current adaptation) from the prior step with core principles (cognitive biases and heuristic thinking) from behavioral science in a structured brainstorming process around the cues, alternatives, and meanings of the target behavior. These insights are used to Generate intervention strategies and designs and to Evaluate those designs through iterative prototyping and trialing. In this paper, we report the results of the Narrow, Understand, and Discover steps; we will complete the Generate and Evaluate steps in future work.

Fig. 1 Flowchart of hypotheses through the three-step NUDGE framework

Analysis

The Narrow and Understand steps of the NUDGE process were undertaken through the innovation tournament described above. The target behavior was identified a priori as implementing EBPs with fidelity in community mental health settings. This study reports the results of the Discover step, which was carried out by a multidisciplinary team including licensed mental health clinicians; doctorally-prepared researchers in economics, public health, and psychology; and graduate student trainees in psychology, nursing, and behavioral and decision science. We identified and mapped barriers to successful implementation of EBPs in session (the behavioral target identified in the Narrow step) using a three-step process:

(1) Formulate hypotheses about behavioral barriers to the target behavior through a structured process that linked our understanding of the target behavior, via the raw ideas, to known cognitive biases, heuristics, and other psychological elements that may generate behavioral barriers; (2) synthesize the hypothesized barriers; and (3) rapidly validate the identified barriers through member checking, expert consultation, and literature review. Importantly, Steps 2 and 3 are iterative and were cycled through multiple times. The output of the process was a coherent, validated set of barriers to the target behavior (i.e., implementing EBPs with fidelity in community mental health settings) that can be used to generate focused, targeted implementation strategy designs. Below we provide a detailed explanation of our methodology for transparency and reproducibility.

Step 1: Brainstorm Hypotheses About Behavioral Barriers

Step 1 begins a process of structured brainstorming to identify behavioral barriers to the target behavior by linking information from the contextual inquiry to common biases and heuristics from the cognitive psychology, social psychology, and behavioral economics literature. First, six investigators immersed themselves in the full set of ideas generated in the innovation tournament and familiarized themselves with leading cognitive, social, and behavioral biases and heuristics from the psychology and behavioral sciences literature. There are many lists, references, textbooks, and taxonomies of common psychological factors (e.g., Behavioral Science Concepts, n.d.; Benson 2016; Health Interventions, n.d.; ideas42 2018; "List of Cognitive Biases" n.d.; Luoto and Carman 2014; Pinto et al. 2014; Samson 2017). Due to its usability and comprehensiveness, the authors utilized the Cognitive Bias Codex as the primary source for cognitive biases and other relevant constructs from behavioral economics (Benson 2016). Following immersion in the raw ideas and the Cognitive Bias Codex, investigators worked independently to formulate multiple hypotheses about possible behavioral barriers that impeded the target behavior (i.e., use of EBPs with fidelity in community mental health settings). During the hypothesis brainstorming and formulation process, investigators drew from a set of prompts about the cues, actions and meanings, and alternatives for each decision or action step of the target behavior; these prompts helped link the raw data (from the Understand step) to a specific behavioral principle in order to discover a novel insight about the target behavior. Prompts related to cues examined at what point a decision or action was prompted (e.g., "How does the environment cue action or fail to cue action?"). Prompts related to alternatives examined what other choices were available, or how easy it was to choose something else (e.g., "How large or numerous is the choice set of possible competing behaviors?"). Prompts related to meanings reflected the participant identities that may be invoked by a particular action (e.g., "What identities are associated with this behavior?"). There were two sets of prompts: one focused on the decision or intention to use an EBP before a session, and one focused on the action and deployment of the EBP in session. Drawing from these prompts, the raw data, and the Cognitive Bias Codex, the investigators formulated specific hypotheses about barriers to EBP implementation related to empirically-validated cognitive biases and heuristics. At the Discover step, the goal is to brainstorm as many hypotheses as possible in an effort to uncover a broad and comprehensive range of possible barriers.

For example, numerous ideas submitted to the tournament proposed checklists, "one-pagers," or other decision aids that a clinician could bring into the session. This suggested that one barrier to the delivery of evidence-based treatments is remembering the multi-step, protocolized repertoires of behaviors required for high-fidelity therapy. An underlying psychological principle is Miller's Law (Miller 1956; Robert 2005), which states that the human brain can hold only about seven (plus or minus two) items in working memory. This yielded multiple specific insights and hypothesized barriers related to the difficulty of executing complex protocols and the high cognitive load on therapists, particularly when cognitive and attentional resources are scarce.
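To make the Step 1 linkage concrete, the sketch below shows one hypothetical way a hypothesis record connecting raw tournament ideas, a prompt type, a behavioral principle, and the resulting hypothesis could be represented. It is purely illustrative: the authors' process was manual, and the class, field names, and example strings here are our own assumptions, not artifacts of the study.

```python
from dataclasses import dataclass, field

# Illustrative only: a hypothetical record for one Step 1 hypothesis.
# The study's brainstorming was done by hand; this sketch simply makes
# the idea -> prompt -> principle -> hypothesis linkage explicit.
@dataclass
class HypothesizedBarrier:
    raw_ideas: list           # tournament ideas that cued the hypothesis
    prompt_type: str          # "cue", "alternative", or "meaning"
    principle: str            # bias/heuristic, e.g., from the Cognitive Bias Codex
    hypothesis: str           # the hypothesized behavioral barrier
    validated_by: list = field(default_factory=list)  # filled in during Step 3

# The checklist example from the text, encoded as a single record:
example = HypothesizedBarrier(
    raw_ideas=["Provide one-page checklists clinicians can bring into session"],
    prompt_type="cue",
    principle="Miller's Law (limited working memory)",
    hypothesis="Remembering multi-step EBP protocols overloads working memory",
)
print(example.hypothesis)
```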

Step 2: Synthesize Hypothesized Barriers

Because the investigators worked separately and did not limit or curtail brainstorming at this stage, many hypotheses were duplicated across sets. Additionally, some hypotheses could be combined with others to form an overarching theme. In Step 2, the full set of hypothesized barriers generated in Step 1 by the six investigators was de-duplicated and synthesized by two investigators (AB, RS) in an iterative process that included both independent and collaborative steps to ensure rigor and validity. Differences were resolved through discussion and consensus. Step 2 was completed iteratively with Step 3.
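As a purely illustrative aside, the de-duplication logic can be sketched in code. The authors de-duplicated by hand through discussion and consensus; the token-overlap screen below is a hypothetical aid for flagging candidate duplicate statements for human review, and the 0.5 similarity threshold is an arbitrary assumption.

```python
# Hypothetical sketch: flag near-duplicate hypothesis statements for human
# review using token-overlap (Jaccard) similarity. Not the authors' method,
# which was manual; the 0.5 threshold is an arbitrary illustrative choice.
def jaccard(a, b):
    """Token-overlap similarity between two hypothesis statements."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if (ta or tb) else 0.0

def flag_duplicates(hypotheses, threshold=0.5):
    """Return pairs of hypotheses similar enough to merit a merge discussion."""
    return [
        (h1, h2)
        for i, h1 in enumerate(hypotheses)
        for h2 in hypotheses[i + 1:]
        if jaccard(h1, h2) >= threshold
    ]

print(flag_duplicates([
    "EBPs don't work with this population",
    "EBPs don't work with my clients",
    "I do not have time to plan before sessions",
]))
```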

Step 3: Rapidly Validate Identified Barriers

To ensure the trustworthiness and reproducibility of the findings, the researchers subjected their final list of hypotheses to a validation process that included expert consultation (two licensed psychologists), literature review, and two interviews with clinicians who had participated in other aspects of the larger project. Interviews were conducted using a structured interview guide that elicited clinician feedback on the EBP implementation process relevant to assessing the face validity of specific hypotheses. Step 3 was completed iteratively with Step 2.

Results

In Step 1, the six investigators separately generated 156 hypothesized behavioral barriers to the focal behavior (range: 18–35 hypotheses per investigator). Two investigators (RS, AB) proceeded with Step 2, independently de-duplicating and synthesizing the full set of 156 hypotheses; one investigator (AB) identified 41 discrete hypotheses, and the other (RS) identified 29. The lists were combined and further synthesized through discussion, yielding 21 hypotheses.

Step 3 (rapid validation) proceeded as described above. Two clinical experts (the first author, RS, and a licensed clinical psychologist who has watched 10,000 hours of video of community clinicians in session and is an expert in the Therapy Process Observational Coding System for Child Psychotherapy Strategies) reviewed the hypotheses for face validity. The first author (RS), a licensed psychologist and expert in the clinician decision-making literature, conducted a literature review for evidence of these hypotheses in the published scientific literature. Two interviews with community-based clinicians who had previously engaged with the parent project were conducted to elicit further input on clinicians' EBP experience, with prompts focused on themes relevant for validation. Ultimately, 16 of the 21 hypotheses were validated by one or more validation methods. Two investigators then returned to Step 2 for further synthesis. For example, through Steps 2 and 3, "EBP seems good for other clients, but not for mine," "My clients are different," and "EBPs don't work with this population" were combined into one hypothesized barrier (i.e., "EBPs don't work here"). Similarly, the hypothesized barriers "my plan to do an EBP goes out the window when the patient comes in upset" and "when talking to a client there is so much going on I can't fit my EBP techniques in" were synthesized into one barrier (i.e., "Just get through the session"). The final hypothesis "What I do works" arose from a synthesis of hypothesized barriers such as "I'm not an EBP person" and "I meet the client where s/he is instead of following procedural details." Our final list of six validated hypotheses is organized along the temporal continuum implicit in our target behavior, from deciding or planning to do an EBP through preparation and real-time deployment of the selected EBP during session. Figure 1 summarizes the hypothesis distillation process from 156 hypotheses to 6. The six hypotheses, the associated cognitive biases, and the evidence from the contextual inquiry are described below, summarized in Table 1, and organized into two categories: planning to do an EBP and deploying EBPs. The last column of the table describes potential implementation strategy designs that might emerge in the Generate phase of NUDGE, which will follow the Discover phase detailed in this paper.

Table 1 Behavioral barriers and corresponding behavioral insights gleaned from contextual inquiry data

Planning to Do EBP

This category encompasses barriers related to forming the initial plan to do an EBP in a session. These barriers emerge before a session, when a clinician is deciding or planning whether or not to engage in EBP during the session. We found three hypothesized barriers that emerge at this stage.

EBPs Don’t Work Here

A striking behavioral barrier that emerged at the planning step is resistance to EBP due to a belief that EBPs do not work for a community population or that efficacy data do not generalize to clinical practice in the community. A common feature of the tournament ideas was a request for demonstrations that EBPs are applicable and relevant to the populations seen in community clinics. For example, one idea suggested the development of a database of EBPs for transgender populations. Another participant submitted an idea about an organization supporting case conferences in which employees show how EBPs work with that organization's clients. These ideas link to the cognitive biases of base rate fallacy (ignoring background probabilities in favor of event-specific information; Bar-Hillel 1980) and anecdotal fallacy (generalizing from possibly isolated incidents; Nisbett et al. 1985): clinicians may over-anchor on a small number of client experiences in which an EBP was not appropriate or effective. Confirmation bias (selectively seeking or attending to information that confirms existing beliefs; Nickerson 1998) can also reinforce the "EBPs don't work" barrier.

What I Do Works

The counterpart to the "EBPs don't work" barrier is a strongly-held belief that "What I do works." Our mapping process revealed that some clinicians' professional identities or therapeutic mindsets do not include EBP use, impeding EBP planning. This hypothesis was motivated by ideas that suggested training opportunities focused on the "art" of therapy—building rapport, engaging clients, and serving as a patient advocate—in contrast to training on the formalized techniques and skills more characteristic of EBP protocols. Other tournament ideas suggested that less emphasis should be placed on one-size-fits-all implementation of EBPs if current therapeutic approaches were working better. The "What I do works" barrier highlights the strength of mental models (deeply-held beliefs about how things work that shape how one interprets the world; Johnson-Laird 2010) and identity priming (when one's identity influences a response to a stimulus; Benjamin et al. 2010). Clinicians' experiences, from their graduate training through current clinical practice, produce a mental model about the efficacy or appropriateness of the therapeutic practices in which they are more trained and comfortable. Finally, status quo bias (Kahneman et al. 1991) suggests that clinicians will stick with current therapeutic modalities as the default unless motivated or incentivized to train in and deliver EBPs.

No One Else is Doing EBP

We know that humans are highly influenced by social norms (Meeker et al. 2016; Sherif 1936), both descriptive (what people actually do) and injunctive (what people think they should do), and clinicians are no exception. Our third hypothesized barrier to forming a plan or deciding to implement EBPs was the perception that other clinicians (and the agency more broadly) do not value or engage in EBPs. Multiple tournament ideas suggested that clinicians increase the visibility of their EBP use or support, or that agencies more strongly signal that they value EBP use, for example by providing EBP-specific supervision. Clinicians' requests that their peers actively demonstrate use of EBPs suggest that it is important to them to follow socially normative practices. In a busy clinical environment with heavy demands on cognitive and attentional bandwidth, the salience (Kahneman 2003) of EBPs and EBP use is also low. When a practice such as EBP use does not appear normative and is low in salience, status quo bias (Kahneman et al. 1991) may also be activated.

Deploying EBPs

This category consists of barriers that emerge when preparing to execute EBPs or at the actual moment of execution in session.

I Do Not Make a Plan

Multiple ideas in the tournament proposed paying clinicians for session preparation time and other incentives to prepare for EBP sessions. This indicated that a potential barrier to EBP delivery in session is a lack of time to plan and prepare beforehand. Clinicians may hold "EBP mindsets" and intend to do EBPs but forget or change their minds in the moment. This relates to the behavioral insight of prospective memory failure (Brandimonte et al. 1996), or forgetting to perform an intended action at the right time. It may also relate to hassle factors (Bertrand et al. 2004): clinicians may want to do an EBP session but lack time to do the necessary planning, especially with a full caseload. These hassle factors may prevent them from executing an EBP in the session.

EBPs are Hard

A consistent theme in the literature on EBPs in mental health practice, strongly echoed in our innovation tournament, is that EBP delivery, particularly with fidelity, is difficult. Ideas submitted through the innovation tournament suggested breaking EBPs down into small, doable steps with decision aids, reminders, and memory aids, including "one-pagers," checklists, and short demonstration videos. Our insight into this barrier is informed by behavioral principles related to cognitive load (Bertrand et al. 2004; Paas and Van Merriënboer 1994), the very real demand on a person's mental resources from executing complex, multi-faceted, multi-component procedures such as EBPs while dealing with complex data. Even a well-trained clinician with strong EBP intentions has to make decisions about the execution of techniques in session that are not specified in any EBP manual. This uncertainty may elicit a negative affective response (Slovic et al. 2002) and result in avoidance of the technique. This avoidance may be fueled by ambiguity aversion (Akerlof 1991; Karlsson et al. 2009), a preference for actions that lead to known or certain outcomes over those that are unknown or risky (such as embarking on an EBP technique with limited time left in the session).

Just Get Through the Session

The final barrier emerging at the execution stage is "getting through the session," which was informed by ideas proposing strategies to prepare and calm clients prior to the session, such as a suggestion for a more relaxing waiting room or funding for case managers to shift social services work outside the therapy room. Our insight into this barrier is informed by the hot–cold empathy gap, the phenomenon that people have difficulty predicting how they will behave in future affective states that differ from their current state (Loewenstein 2005). Clinicians may have difficulty executing an EBP while in session with an emotional patient, and may also persistently fail to remember (when in a "cold state") how disruptive the client's and clinician's "hot states" will be to effective delivery of the EBP. This can lead to being unprepared or underprepared to bring a session back to EBP delivery if it goes off track, or to being overoptimistic or overambitious about the ability to deliver an EBP with fidelity in a session where both client and clinician are in a "hot state." At the same time, a clinician in the middle of a session with a challenging client or situation may be unable to recall the plans, intentions, and coping strategies that the "cold state" self put in place for the session.

Discussion

To our knowledge, this is the first study to infer, from clinicians' proposed solutions, the behavioral barriers that might impede the implementation of EBPs, with the aim of developing implementation strategies directly from those barriers. We applied a novel methodology informed by behavioral economics and used data from a diverse swath of clinicians engaged through an innovation tournament. We used these ideas as our raw data—as signals or cues to identify the behavioral barriers (and corresponding cognitive biases) to the implementation of EBPs. These results deepen our understanding of clinician decision-making through the lens of cognitive biases, facilitating our ability to design targeted and novel implementation strategies.

Consistent with other studies (Beidas et al. 2015; Stewart et al. 2012), the behavioral barriers we identified through NUDGE largely fell into two broad categories: those related to making a plan to do an EBP and those related to execution of the EBP in session. These different categories of behavioral barriers (and corresponding cognitive biases) may represent different populations of clinicians and will require different strategies for intervention. Specifically, clinicians who are encumbered by attitudinal barriers related to making a plan to do an EBP may lack the desire or motivation to use EBPs. The implications for design (in the forthcoming Generate phase) are that strategies should seek to disrupt clinicians' current mental models of what EBP signifies (e.g., by leveraging the "vividness effect" [Taylor and Thompson 1982] via realistic, detailed, and emotionally compelling studies) or to make descriptive or prescriptive norms more salient and persuasive via social comparison (Meeker et al. 2016).

The problem of EBP usage is different for clinicians who are encumbered by behavioral barriers related to the execution of the EBP. The challenge for this group is not attitudinal, but rather one of helping these clinicians take action to meet their espoused EBP goals. This may require concrete planning, coping, and checklist strategies, because objectives are more likely to be achieved when they are accompanied by simple, specific planning (Casper 2008; Gollwitzer 1999; Milkman et al. 2011). Effective strategies for this group developed in the Generate stage might also target hassle factors and cognitive load, helping therapists to "chunk" sessions into manageable and simple techniques.

The present study illustrates the application of NUDGE, a systematic, theoretically-informed approach to the identification of behavioral barriers grounded in behavioral economics, which can complement the conceptual frameworks that traditionally guide practice and research within implementation science (e.g., CFIR; Damschroder et al. 2009). NUDGE fills a gap in the implementation science literature related to tailored design by showing how raw inputs from frontline practitioners in community settings can be leveraged to generate robust, theory-informed behavioral insights about the drivers of implementation behavior that directly inform strategy design. A key contribution of NUDGE, distinct from intervention mapping and other traditional contextual inquiry approaches (e.g., Kok et al. 2016), is the explicit linking of well-characterized cognitive biases and heuristics to contextual data in order to identify specific barriers to the desired behavior that can inform intervention targets.

The use of raw stated-preference data as a starting point from which to deduce behavioral barriers that can be targeted with implementation strategies is an essential feature of NUDGE. Why not take participants' stated preferences at face value? One reason is that asking people for ideas, or to report on barriers, does not always yield an accurate account of the impediments to behavior (Asch and Rosin 2015); however, participants' responses can help us generate insights into behavioral barriers by revealing the underlying assumptions, cognitive biases, or heuristics that participants employ in explaining the behavior. Our approach differs from traditional contextual inquiry in that we systematically investigated the implicit cognitive biases and heuristics that may contribute to implementation challenges. In other words, we aim to implement not the preferences themselves but rather implementation strategies informed by the behavioral insights that the preferences reveal.

NUDGE offers an opportunity to build implementation strategies from the ground up, supporting an in-depth understanding of clinicians' real difficulties and the design of implementation strategies that accurately map back to these challenges. This approach goes beyond simply eliciting stakeholder preferences: it gave us perspective on where barriers appear to be getting in the way of doing EBPs and insight into what we should do to overcome them. This rigorous process reverse-engineered ideas into hypotheses to identify salient barriers to behavior. Importantly, some of these biases can be harnessed through implementation strategy design to encourage more evidence-based practice by reframing the context of the choice.

We gleaned a rich set of hypotheses through this behavioral design process. Many of the hypothesized barriers have non-incentive-related design implications (e.g., strategies that make EBPs easier to use). Additionally, many of the ideas in the tournament pertained to training strategies; yet clinicians in Philadelphia County have more training than the average community clinician because of a decade of system-supported training initiatives that have trained hundreds of clinicians (Powell et al. 2016). Our diagnosis suggests that training is not always the true preference but instead emerges downstream from a thought process such as, "I guess I don't use EBP because I don't feel prepared. How could I be better prepared? Oh, I guess I must need more training." We suspect that additional training would not enhance this competence or confidence, but that simplification strategies (Service et al. 2014) such as skill-building techniques, preparation tools, reminders (Karlan et al. 2010), or aids that make implementation easier (and reduce cognitive load) may improve this sense of mastery and therefore execution. This again speaks to a strength of this study: going beyond ideas or suggestions taken at face value, similar to the "5 Whys" (or "5 So Whats") technique used in process improvement, in which one repeatedly asks why a failure or problem occurred in order to reach its true root cause (Arnheiter and Maleyeff 2005).

The NUDGE process is most useful for uncovering behavioral barriers to desired actions. In the context of mental health care, these actions might be taken by patients or clinicians, or by administrators of health care delivery systems. In the current study, we narrowed our focus (the first step of the NUDGE process) to the clinician's decision to use an EBP and the related action of deploying the EBP in session. The resulting behavioral barriers were therefore also focused at that level, although implementation strategy designs that emerge from this analysis may require organizational change (or at least leadership buy-in and commitment) to put in place. One promising area for future research is whether NUDGE can be helpful at the level of organizational decision-making and for identifying behavioral biases at the organizational level (Behavioral Insights Team 2017). Example target behaviors might be how payers and organizational leadership select and resource professional development activities for clinicians, or how they make decisions about reimbursement policies and incentive schemes.

Some limitations should be noted. First, our greatest strength is potentially a limitation in that we inferred problems from suggested solutions (which are still self-reports). Second, these ideas came from community clinicians in one system in one part of the country; thus, the inferred behavioral barriers and insights may not generalize to community behavioral health at large. The value of NUDGE is that it can be applied to unique contexts to identify behavioral barriers for targeted populations; the extent to which behavioral barriers to EBP implementation identified in one population generalize to other populations remains an open empirical question. Third, we recognize that there are hundreds of EBPs from which to choose, and we did not focus on that choice but on steps further downstream, after a particular EBP had been selected. Lastly, NUDGE is designed specifically to address only behavioral barriers rooted in cognitive biases and heuristics. By definition, then, this approach cannot address structural issues and barriers (e.g., scarcity of resources), of which there are many in publicly funded behavioral health (Beidas et al. 2015; Skriner et al. 2017).

Conclusion and Future Directions

We gleaned a rich set of six hypothesized barriers to EBP implementation through the innovative, hybrid nature of our contextual inquiry, and we utilized a novel method, NUDGE, to inform implementation strategy design. This study represents a first step in understanding how a behavioral design process can inform implementation strategy development in mental health care settings. By enhancing methods for rigorous implementation strategy design such as NUDGE, we can better tailor implementation strategies and advance the field of implementation science. Future directions include the transformation of these hypotheses into testable implementation strategies, such as paid pre-session preparation time. We are optimistic about continued application of the approach within implementation science to design relevant and impactful approaches to improve the quality of services in behavioral health.

Availability of data and materials

Data will be made available upon request. Requests for access to the data can be sent to the Penn ALACRITY Data Sharing Committee. This Committee comprises the following individuals: Rinad Beidas, PhD, David Mandell, ScD, Kevin Volpp, MD, PhD, Alison Buttenheim, PhD, MBA, Steven Marcus, PhD, and Nathaniel Williams, PhD. Requests can be sent to the Committee's coordinator, Kelly Zentgraf, at zentgraf@upenn.edu, 3535 Market Street, 3rd Floor, Philadelphia, PA 19107, 215-746-6038.