Evidence-based practice (EBP) is an integrative decision-making process aimed at improving client outcomes and effectiveness in social work practice (Drisko and Grady 2012). Yet, for many reasons, social workers continue to struggle to understand EBP and to integrate it into agency and individual practice settings. This article will examine some of the current challenges that practitioners and educators have raised regarding EBP, identify some of the reasons to use EBP, and make some recommendations on how the profession can more effectively infuse EBP into social work education and practice.

The Definition of Evidence-Based Practice

What Evidence-Based Practice Really Is

Over the past 20 years, EBP has emerged as a process for integrating research evidence into clinical practice in combination with the client's needs, values and preferences, as well as the clinician's professional expertise. There is no disagreement across the core health professions about what EBP is or how it is defined (Academy of Medical-Surgical Nurses 2014; American Speech and Hearing Association 2015; American Psychological Association 2015; Guyatt et al. 1992; National Association of Social Workers 2010; Sackett et al. 2000). Yet social work practitioners (Simmons 2013) and social work academics (Rubin and Parrish 2007) are often unclear about just what EBP actually is. Further, the way administrators and payers use the phrase 'EBP' has significantly undermined clarity about its meaning and has often omitted some of its core components.

The current definition of EBP follows that of evidence-based medicine (EBM). Sackett et al. (2000, p. x) at McMaster University in Canada define EBM as “the integration of best research evidence with clinical expertise and patient values.” In social work, Rubin (2008, p. 7) states similarly that “EBP is a process for making practice decisions in which practitioners integrate the best research evidence available with their professional expertise and with client attributes, values, preferences and circumstances.” The definition of EBP is actually very clear; it is just not effectively taught, nor well understood. It is also rarely practiced in full.

The EBP process has four equally weighted parts: (1) current client needs and situation, (2) the best relevant research evidence, (3) client values and preferences, and (4) the clinician's expertise (see Fig. 1). Note that clinical expertise is where client needs, client preferences and research knowledge are all integrated. Despite the heavy focus it receives from academics and payers, the best research evidence is just one of four parts of the EBP process. In practice, client values and preferences and clinical expertise are critical to doing EBP correctly. EBP is not solely about research.

Fig. 1 The four parts of evidence-based practice, from Haynes et al. (2002)

For many professionals, it is easy to understand the EBP process if we apply it to a hypothetical situation about personal medical decision-making. If a checkup reveals cancer, an oncologist should help us understand what treatment options would likely be effective based on the best available research knowledge. Here research knowledge helps define the several assessment and treatment options that are most likely to be safe and effective. We would want to know the likely benefits, risks, and side effects of each treatment option to help decide which one we would prefer. We would want to do this in active and ongoing collaboration with the clinician, whose expertise helps clarify things we might not understand or consider. We might reject options that have good research support for reasons of personal preference or beliefs, or because of the side effects of the treatment. In the end, the patient has to decide which course to take, even that of no treatment at all. EBM and EBP are not simply dictated by research evidence, though this evidence is a vital part of making and weighing informed choices about treatment options.

EBP in social work practice follows this same format. Research evidence is one important part of a complex, collaborative, treatment planning process. The steps and process of EBP have been clearly articulated by social workers and by other health and mental health professionals (Drisko and Grady 2012; Norcross et al. 2008; Rubin and Parrish 2007; Sackett et al. 2000).

What EBP is Not

EBP is a collaborative process for making treatment decisions. Yet based on Simmons’ (2013) national survey, the vast majority of practicing social workers in the United States think that EBP is about selecting treatments from a limited list of research supported and payer endorsed treatments. This view appears to reflect how payers use EBP administratively to limit mental health care costs, but it conflates EBP with some other concepts. Empirically supported treatments (ESTs), also called empirically supported interventions (ESIs), or research supported treatments (RSTs), are treatments that have some research support. Note that the focus here is on establishing that a specific treatment works better than no treatment and with little harm. Establishing that a treatment is effective and has relatively few side effects is vital to the research evidence part of EBP, but is not the same as the entire EBP process. Practitioners, payers and even academics often confuse these ideas. Administrative practices may make it appear that EBP is using payer-defined treatments. This, however, is not consistent with the actual definition of EBP.

A simple way to conceptualize the differences between EBP and ESTs is that EBP is a process and ESTs are products. The EBP process may include a decision that an EST is the best course of action, for example using Prolonged Exposure Therapy to decrease avoidance in trauma cases. However, the EST product (an established treatment protocol) must be considered along with three other factors in the EBP process. The client, based on personal values and preferences, may decide such an intervention is not right for him/her. Or perhaps the clinician has concerns that the client might find prolonged exposure overwhelming.

Psychologists Chambless and Hollon (1998) define an EST as a treatment demonstrated to be better than no treatment in at least two experimental studies. In addition, the treatments must be standardized using a manual, and to limit bias, researchers other than the creator of the treatment must do at least one of the two studies. Yet insurance companies and government payers may set different standards for determining their list of ESTs or authorized treatments. In fact, lists of designated ESTs vary widely among payers. Further, EBP is not a list of ‘best practices.’ The term ‘best practices’ appears to have no standard definition. What authors label as a best practice may draw on widely varying kinds of research evidence—or little research evidence. Clinical social workers need to understand the EBP process and distinguish it from payer and administrative lists of treatments. EBP is a process undertaken directly with a client or client system.

The Current State of EBP Impacting Clinical Social Work Practice

Health Care Costs, EBP, and Practice

Economic and administrative pressures have made EBP an important shaping influence on contemporary clinical social work practice. To ensure service quality, while reducing costs, payers, politicians, insurance administrators and others advocate for EBP and ESTs. However, the way these advocates use and describe EBP is often at odds with the four parts of the actual EBP process. Drawing on the medical model of EBM, socially complex interventions, including psychotherapy, are often presented by administrators and researchers using a 'drug metaphor' (Shapiro et al. 1994). The drug metaphor approach assumes (a) that the 'active ingredients' of change are found solely in the therapist's behavior and that the client is essentially passive, (b) that these ingredients are well known and fully specified, (c) that names applied to these active ingredients are applied consistently, and (d) that processes and outcomes form a simple cause-and-effect relationship. The drug metaphor assumes psychotherapy is much like taking an antibiotic. Thus, most researchers assume it is appropriate and valid to compare therapies without regard to differences in client motivation, personal histories, or available social support systems. Researchers may also assume that psychotherapies using the same name are actually the same in techniques, duration and emphasis, even when delivered by different professionals to varying populations in different settings. Despite considerable, some say overwhelming, evidence that the drug metaphor is too simple for socially complex interventions like psychotherapy (Norcross 2011; Wampold 2001), the same medical 'drug' model is used in most psychotherapy outcome studies that serve as evidence in EBP. Research evidence based on simplistic, reductionistic studies then constitutes the justification for restricting available services in order to limit costs.

While social workers understand the importance of limiting health care costs, they also support clients' rights and autonomous choices, and understand the importance of the therapeutic relationship to effective services (Drisko 2004, 2013). Clinical social workers know psychotherapy is an extremely complex process that they must tailor to individual needs and preferences consistent with the actual EBP process. Clinical social workers often question the validity and comprehensiveness of much psychotherapy research. They are willing to use 'the best available research' but often view it as unrealistically simple and quite unlike the needs, backgrounds, situations and preferences of typical clients. To maintain professional integrity, many clinical social workers evade administrative requirements, implicitly questioning that administrators' use of EBP best serves client needs (Arnd-Caddigan and Pozzuto 2010). Evidence-based practice can become a public relations gimmick, a window dressing, rather than the intended process to improve clinical practice.

Payers, administrators and supervisors should encourage use of treatments with a solid research evidence base. But lists of authorized treatments must both conform to the highest conceptual and methodological standards of research and then be included in the EBP process rather than essentially overriding client choice and clinical expertise. EBP is a practice process, not an administrative one (which would inherently exclude client preferences and diminish clinical expertise).

If EBP is not what payers and administrators seem to claim it is, what is going on? The phrase 'evidence-based practice' has been widely connected to payer efforts to simultaneously reduce health care costs and improve health care outcomes. These are both important and worthy goals. The problem may be one of interpretation: What payers are doing is using the window dressing of EBP to initiate measures to control health care costs. Yet restricting clinicians from using treatments other than those on 'approved' lists is inconsistent with the client choice and clinical expertise aspects of the actual EBP process. These administrative distortions of EBP also appear inconsistent with social work's core professional values. Therefore, a critical challenge for our profession is how to train practitioners to use EBP in a way that is both consistent with our professional values and improves client outcomes.

Academics and EBP

As stated earlier, several scholars have challenged social work to integrate EBP more fully into the profession (Gambrill 2006; Mullen et al. 2008; Rubin 2008). This challenge has led to some concrete changes in social work education, such as the inclusion of 'evidence-informed practice' in the Council on Social Work Education's (CSWE) 2008 and 2015 Educational Policy and Accreditation Standards (EPAS), along with a greater emphasis on practice research. Note carefully that CSWE's unique terminology may lead to additional confusion about the actual definition of EBP. Other concrete changes include the heightened attention that some schools of social work have given to EBP by creating specializations in EBP or by shifting curricula in practice courses to highlight only ESTs.

In spite of these changes in emphasis, several studies continue to demonstrate that social work academics often do not include EBP in their teaching. A national study of social work programs, along with psychology and psychiatry training programs, demonstrated that social work offered the least training in research-based interventions of the three professions, either in the classroom or in supervision (Weissman et al. 2006). Bledsoe et al. (2007) found that the majority of social work schools (62%) did not require didactic content and clinical supervision in any empirically supported psychotherapy (EST). Although these studies were conducted prior to the 2008 EPAS changes, more recent research indicates that social work educators continue to fail to "incorporate the available research into their curriculum decisions" and do not integrate the process of EBP into their curricula (Grady et al. 2010, p. 475). Grady et al. note "the persistent effect of this lack of training in EBP is the creation of an intergenerational cycle of lack of knowledge and experience in EBP and ESIs" (p. 476).

Part of the struggle for academics in incorporating EBP into the training of social workers may be that many academics remain confused about the definition of EBP (Grady et al. 2010; Rubin and Parrish 2007). The confusion among social work educators obviously contributes to the current confusion of social work practitioners who graduate from their programs (Simmons 2013). Recent research demonstrates that the confusion around the definition of EBP persists among recent MSW graduates, even those who attended programs after the 2008 EPAS changes (Grady et al. under review). This research also indicates that their employing agencies and their administrative and clinical supervisors are also confused about EBP. The practitioners in this study reported that this widespread confusion has a ripple effect: Practitioners do not receive adequate supervision and training that infuses EBP as a process into daily practice, which leads to a lack of integration of EBP into the workforce; in turn, this knowledge is not passed on to the next generation of social workers.

Practitioners and EBP

Even when practitioners do have a clear definition of what EBP is, and receive additional support in using the process, many report that they still find it overwhelming to use in practice (Bellamy et al. 2008; Wike et al. 2014). There are several practical and logistical challenges for social workers doing EBP. These include limited access to up-to-date research literature (Bellamy et al. 2008; Proctor et al. 2007) and limited confidence in their ability to interpret the research (Bellamy et al. 2008; Bledsoe-Mansori et al. 2013). In addition, time pressures due to high caseloads and other work demands (Murphy and McDonald 2004; Nelson et al. 2006; Proctor et al. 2007) make doing EBP difficult in practice. Social workers have also noted broader philosophical challenges to EBP. These include whether EBP can address the nuances and complexities of practice (Bellamy et al. 2008; Brekke et al. 2007; Murphy and McDonald 2004; Nelson et al. 2006; Pollio 2006; Proctor et al. 2007; Rosen 2003) and whether the available research is relevant to their clients and to their practice (Bellamy et al. 2012; Wike et al. 2014). Furthermore, many social workers question whether we are educating practitioners to be mechanics and atheoretical in their work rather than autonomous professional experts (Bellamy et al. 2008). A consistent theme across these studies is whether social workers are being asked to become more 'medicalized' and more removed from our own professional values and teachings.

An additional concern is whether EBP and ESTs include attention to the therapeutic relationship (Drisko 2013; Graybeal 2014; Proctor et al. 2007). This debate centers on whether the relationship between the client and the practitioner impacts practice outcomes (Grady and Keenan 2014). Yet EBP does not inherently negate the research on the strength of the therapeutic relationship, nor does it ignore the importance of the relationship at different stages of the EBP process, including most importantly the assessment phase (Grady and Drisko 2014). The focus of EBP research on aggregated treatment outcomes does, however, shift attention away from how process and outcomes are connected in more detailed and specific ways. This shift in focus is often interpreted as a lack of interest in, or attention to, the process, specifically the therapeutic relationship.

Wike et al. (2014) provide a summary of the literature on social work practitioners' attitudes towards EBP, which are mostly negative. Much of the research they cite indicates that social workers equate EBP with ESTs. They state,

Some clinicians continue to express negative attitudes about EBP, due to the belief that ESIs [ESTs] acquired through the EBP process require clinicians to disregard clinical experience, empathy, and creativity in order to make practice decisions based exclusively on research evidence which may be irrelevant. (p. 164)

Many scholars have noted that if the profession could make clear that EBP is an interactive process requiring considerable critical thinking rather than just the use of a specific EST/ESI, then many clinicians might be more open to using EBP in their practices (Barth et al. 2011; Borntrager et al. 2009; Rubin and Parrish 2007; Wike et al. 2014). Effective education, providing a correct explanation of EBP, could help support its use in clinical practice.

Further Challenges to Doing EBP in Practice

How are Client Preferences and Clinical Expertise Included in EBP?

Beyond increasing an accurate understanding of EBP and supporting it with funding and training, some additional issues exist. Just how client preferences and values are understood in practice warrants much more elaboration and professional discussion. Gilgun (2005) points out that neither client preferences nor clinician expertise is well defined in EBP or EBM, and neither is emphasized in the literature on EBP. Client preferences, a key part of EBP, are likely to be ignored by payers, administrators and clinicians. Social workers should surely endorse the importance of collaborating with clients, and of supporting their autonomy and active involvement in treatment decision making, consistent with our core professional values [National Association of Social Workers (NASW) 2008]. Gilgun (2005) also notes that clinical expertise is not elaborated in the EBM/EBP model. In many respects, payers' use of fixed lists of approved treatments actively contradicts client values and preferences as well as clinical expertise as meaningful elements of EBP. No wonder clinicians have difficulty understanding EBP: It is misused as a label for a process that actually undercuts both client choice and clinical expertise (Goldenberg 2009; Groopman 2010).

Finding and Evaluating the Best Research Evidence?

While strong international efforts, such as the Cochrane Collaboration and the Campbell Collaboration, have sought to provide ready access to summaries of high quality experimental research knowledge, clinical social workers still note that accessing such knowledge is time consuming and difficult (Wike et al. 2014). Many problems and treatments of interest to clinicians simply lack experimental outcome research. Such treatments are not necessarily ineffective; they simply have not yet been sufficiently tested. Even when studies are located, social workers often find it difficult to access and critically evaluate the quality of individual research studies. They also note that a lack of such expertise is common among their supervisors (Drisko and Grady 2012). Most studies of mental health treatments are based on very small samples. Several authors note the astonishing lack of inclusion of diverse populations in research articles and systematic reviews (Drisko and Grady 2012; Zayas et al. 2011). Finding relevant research is difficult, and determining its quality and relevance to a particular client may also be challenging.

Are Modified Treatments Consistent with EBP?

Another issue is whether modifying a strongly supported RST still constitutes use of an EST (Drisko and Grady 2012). Modifications to treatments are common in practice due to limited resources or the lack of appropriately trained clinicians or programs. For example, can agencies provide only part of dialectical behavior therapy (DBT) and still rightfully call it DBT? Can a clinician without formal DBT training and certification provide DBT? Without knowing the specific curative factors for each therapy, it is not clear if it is appropriate to alter any defined therapy. Modifications to manualized treatments appear to generate new and untested treatments; they do not automatically remain comparable to the original treatment. Issues of distorted or false advertising about treatments, and about the competence of clinicians to deliver named treatments, may be grounds for legal action by clients against agencies and clinicians. This issue warrants professional attention and careful study.

Why Do EBP?

With all of these critiques and challenges, readers might be asking, so why do EBP? In our opinion, EBP is a process aimed at seeking better outcomes for clients. By incorporating all of the aspects of EBP, practitioners adopt a holistic approach to working with clients, which is consistent with social work’s professional values. However, the irony of EBP is that there is a lack of evidence as to whether there are actually better client outcomes when agencies adopt a culture of EBP and use the process consistently (Drisko and Grady 2012).

Another reason to use EBP is that it intentionally and actively involves clients in the treatment planning process. Their preferences, values, and circumstances are all included in the decision-making process, which is again consistent with social work values regarding client autonomy and collaborative relationships. Similarly, by using EBP with clients, clinicians are more likely to create client 'buy-in' to an explicit treatment plan with measurable outcomes. By having such a plan, both the client and the clinician can track the progress of the intervention and make adjustments when needed. The NASW Code of Ethics (2008) states that social workers must provide clients with effective interventions. Social workers must be accountable for their work with clients (Gambrill 2013). Without tracking outcomes, clinicians have no means of evaluating whether the intervention is working and whether the client is moving towards meeting their goals.

A final reason to use EBP is that, whether one likes it or not, by incorporating a sound decision-making process that includes, but is not limited to, research evidence, we increase our credibility with other professionals and payers. Other professionals have ranked social workers as having little to no evidence for their intervention approaches with clients, especially in comparison to other professions (Murphy and McDonald 2004). Of course pleasing other professionals should not be a driving force behind adopting EBP. EBP does, however, have the potential to demonstrate the breadth and depth of social workers' capacities, including our empirical knowledge, our clinical assessment skills, and our capacities to incorporate contextual factors into our work with clients. All of these aspects of EBP are consistent with our profession's values. Therefore, EBP in many ways provides a means to demonstrate our unique strengths when working with other professionals. As a profession, social workers are holistic and infuse the bio-psycho-social-spiritual perspective throughout our assessment and intervention processes. Social workers are trained to integrate multiple sources of information and knowledge into our work. EBP, correctly understood and fully applied, is a vehicle through which we can highlight the uniqueness and strengths of our profession.

Recommendations

We have some specific recommendations regarding how practitioners and the profession can more effectively integrate EBP into social work.

Social Work Education

One of the primary ways that the profession can more effectively infuse the correct definition and use of EBP is to teach EBP in all curriculum sequences within social work programs (i.e. practice, field, research, policy, human behavior in the social environment). The EBP process integrates several forms of knowledge and requires critical thinking. Therefore, it fits well within all areas of the social work curriculum. Within courses, social work academics should clarify what EBP is, and what it is not. Students should be asked to practice each of the steps of EBP, preferably using their own cases or using standardized training cases. By practicing the EBP process, they will become more familiar with the available research sources, gain confidence in their ability to evaluate and critique the available research, have informed discussions about the state of practice research, and in turn possibly become more motivated to become producers of research. In addition, by practicing with their own cases, students will understand the strengths and limitations of EBP in the real world. They will also actively include clients in decision making, which supports both clients' rights and dignity. When EBP is integrated into the curriculum, students may see clearly how the various components of the MSW program curriculum are connected to each other.

For example, in the first step of EBP students will learn the importance of a solid and thorough assessment. They must have strong assessment skills to assess clients' needs and develop a searchable question. This assessment should involve their knowledge of human behavior, issues related to diverse identities, and the person-in-environment (PIE) perspective. Both EBP and the PIE perspective emphasize the importance of context. As such, students would need to consider not just the immediate or micro context of the client in EBP, but also explore relevant policies that might influence someone's access to potential interventions (e.g. a client's criminal background or immigration status). In addition, as they begin to think about a searchable question, they may identify gaps in their own clinical skills that they can then seek to address in their training.

In the second and third steps of EBP, students need to understand how to find, critique, and interpret relevant research, pulling in another area of the curriculum. Social work educators who teach research courses should include meta-analyses and systematic reviews in their courses. Educators should more fully integrate practice-relevant research into research courses. EBP advocates for the use of the best available evidence. Yet quite often the best available evidence is not a randomized controlled experimental study. Often what is located are case studies or quasi-experimental studies. Students need to be able to read and understand a range of research articles to be effective consumers of the available research.

In the fourth step of EBP, students must engage actively with clients. In these conversations, they need to use a variety of clinical skills, such as clarifying, interpreting, and reframing. Students will need to track the clients' affect and any noticeable changes in the clients' behaviors and then adjust accordingly. They will need to learn how to build a collaborative partnership with their clients as they then plan for the chosen intervention along with their clients in the fifth step. This fifth step also emphasizes the ethical principle highlighted within the NASW Code of Ethics that we should practice within our areas of competence and expertise (Gambrill 2013; NASW 2008). EBP emphasizes transparency regarding what is available in the research (Gambrill 2006). If a client chooses an EST that is outside of the social worker's expertise to deliver, an ethical and practical dilemma ensues (Drisko and Grady 2012). In this way, students can be exposed to a critical aspect of ethical practice in doing EBP. Through such discussions, ethics are easily included in course content. Finally, EBP requires critical thinking at each step of the process, a skill that we all hope social workers will use throughout their careers.

By including EBP in every aspect of the curriculum, students will gain more confidence in their ability to use it. EBP will become an essential and useful component of their practice. However, the classroom cannot be the only place where EBP is taught and discussed. Practicum supervisors and field liaisons must reinforce the use of EBP in the field. In our opinion, social work skills and knowledge are always strengthened when students are able to integrate their classroom and field experiences. Therefore, field education administrators must provide thorough and on-going training to practicum supervisors. In addition, doing EBP should be included as part of written field assignments and learning contracts between the student and the field agency.

Social Work Research

One of the critiques that social workers have cited about EBP is the limited kinds of research that are currently available or valued in EBP. While EBP emphasizes using 'the best available evidence,' what is often summarized is solely experimental research. Experiments can document cause-effect relationships, but only if they are adequately conceptualized and implemented fully and correctly. Other types of research can also provide useful knowledge for practice (Drisko and Grady 2012; Gilgun 2005). Non-experimental research is also crucial to establishing that experiments are well conceptualized and to fully identifying threats to internal and external validity. Some social work scholars have begun to advocate for the academy to re-value other forms of research, such as quasi-experimental, qualitative, and case studies (Rubin 2014). These various forms of research allow different forms of knowledge to be included in EBP. In turn, social work education must address a wide range of practice research methods and issues.

Practitioners argue that the available research is too restrictive in scope and practice relevance. That is, studies often include a small, very narrowly defined group of participants, often in artificial settings (Bellamy et al. 2008; Nelson et al. 2006). Social workers work with multiple, complex clients and in varied settings. As such, we recommend that social work research should include studies done in ‘real-world’ settings with ‘real-world’ clients (Nelson et al. 2006; Rubin 2014). Academic journals should encourage scholars to submit manuscripts that emphasize how ESTs were adopted and modified in the real world so that practitioners can learn from those experiences (Drisko and Grady 2012; Rubin 2014). In doing so, practitioners will be exposed to creative ways in which ESTs can be used in different contexts with diverse populations.

Continuing Educational Challenges

Educators simply omit, or teach minimally, several technical aspects of the EBP literature in typical BSW and MSW curricula. Systematic reviews (SRs) of the research literature are based on meta-analytic statistics that are rarely taught in typical BSW and MSW programs. SRs synthesize the results of multiple studies, combining careful analysis, critical thinking, and technical knowledge. The results of SRs are frequently reported using statistics such as the number needed to treat (NNT), odds ratios (OR) and relative risks (RR). These statistics are not found in contemporary social work statistics texts (Drisko and Grady 2012).
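To illustrate what such statistics convey, consider a purely hypothetical trial (the numbers below are invented for illustration only) in which 40 of 100 treated clients improve compared with 25 of 100 clients receiving usual care. The standard formulas then yield:

RR = (40/100) / (25/100) = 0.40 / 0.25 = 1.6 (treated clients are 1.6 times as likely to improve)
OR = (40/60) / (25/75) = 0.67 / 0.33 = 2.0 (the odds of improvement are doubled)
NNT = 1 / (0.40 - 0.25) = 1 / 0.15 ≈ 7 (about seven clients must be treated for one additional client to improve)

Reading a systematic review requires at least this level of fluency, since an impressive-sounding odds ratio can correspond to a modest absolute benefit and a large NNT.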

Where only one study, or a number of studies based on different research designs, is located in an EBP literature search, important critical judgments are needed to identify the best available research evidence. While medical 'hierarchies of research evidence' (Oxford Centre for Evidence-based Medicine 2011) exist, they devalue qualitative research results and case studies, which may be the 'best' available evidence for some clients' needs. Methodologies to synthesize multiple qualitative research studies need additional refinement and discussion.

Challenges for Social Work Practice

Integrating Client Values and Preferences into EBP

Gilgun (2005) notes that client values and preferences are not well defined in EBP models, nor is how to include them in the process well explained. Prioritizing client values and preferences fits with contemporary models of patient-centered practice (United States Agency for Health Care Research and Quality 2002). Gambrill (2001) argues that highlighting client choice diminishes the potentially authoritarian stance clinicians may take with clients. Further, it is a step that distinguishes EBP from traditional medical models and hierarchical 'expert' approaches. EBP requires that clinicians explain complex problems and treatments to clients, but this is a collaborative dialogue, not a one-time presentation of facts. Towle and Godolphin (1999) state that current legal and ethical standards require informed, shared decision making in health care.

Integrating Professional Expertise into EBP

Similarly, Gilgun (2005) states that the EBP literature does not define professional expertise fully, nor does it elaborate how expertise shapes the process. Professional expertise of several kinds is required in engaging clients, in completing a thorough assessment, in locating the best research evidence and in collaboratively discussing treatment options with clients. Yet it is unclear how much latitude professionals have in adapting and individualizing manualized treatments or programs. Such changes arguably lead to a new and untested form of treatment. Further, it is unclear how professionals should prioritize social factors in combination with mental health concerns. For example, a homeless client may not be able to enroll in a treatment program because of their homeless status. If the client seeks help with anger management, which may complicate obtaining housing, should the professional shift focus to obtaining housing as a step toward treatment? Would mental health agencies always be able to provide such comprehensive services? Even if approved, how would such services be funded? The purpose, focus and limits of the application of professional expertise in EBP require much greater discussion and elaboration.

Clinical social workers need to be clear that EBP, as a process, includes client values and preferences, as well as clinical expertise. Clinical social workers need to forcefully educate administrators and payers about what EBP actually is, and demand from them attention to all its parts. Clinical social workers should formally and carefully document instances when administrators and payers obligate them to take actions that are not consistent with the actual definition of EBP. This is consistent both with excellence in practice and with social work's professional values. Yet clinical social workers should consider, and collaboratively discuss with clients, how ESTs may effectively serve their needs. Where clients do not choose to undertake ESTs, this too should be carefully and formally documented in the client's record. Social work's professional organizations should also actively educate members and the public about the actual definition of EBP. Client choice, the role of clinical expertise, and payers' reasonable concerns about limiting health care costs all warrant ongoing attention and advocacy by professional organizations.

Drisko and Grady (2012) offer a set of case studies showing the strengths of, and challenges to, doing EBP in clinical social work. They find that the EBP model can be very useful, but use the cases to detail the complexity of doing it in everyday practice settings with complex client needs and situations. These case examples help fill out the complexity of doing EBP in real-world settings.

Conclusion

Evidence-based practice has the potential to highlight social work's strengths, namely our capacity to integrate sources of knowledge in our holistic and client-centered approach to clinical practice. Although there are numerous challenges to infusing EBP into practice settings, social work programs can address some of these challenges by helping the next generation of social workers become better versed in, and more confident in, their capacity to use EBP. In addition, administrators, supervisors, and others can help create a culture of EBP that normalizes its inclusion, clarifies its definition, and maintains the central role clinicians play in each step of the process. By infusing EBP into everyday social work practice, clinicians, agencies, and clients will all benefit from the integrated knowledge and intentional approach that EBP brings to practice.