FormalPara Learning Goals

After reading this chapter you should be able to:

  • Explore a range of routes into research and identify which routes to pursue;

  • Explain the difference between evidence-based practice and practice-based evidence;

  • Understand the difference between efficacy and effectiveness and how to reach an equilibrium that suits your personal and professional style;

  • Identify the impact research can have on practice and how research can be applied to daily practice;

  • Practise refining and developing a shared language between researchers and practitioners in daily practice.

Introduction

Evidence-Based Practice (EBP) and Practice-Based Evidence (PBE)

Throughout this book, the issue of research-supported practice has been an underlying theme. The authors have looked at ways in which we might navigate issues around research and practice by considering the different routes into research. In this final chapter, we will focus on building an evidence base born out of clinical practice. We offer a synopsis of evidence-based practice (EBP) and practice-based evidence (PBE) and consider the sliding scale of efficacy and effectiveness studies, whilst drawing attention to how we might reach an equilibrium to suit where we are in our own personal and professional development. We encourage you to consider the internal and external impact that research could have on your daily practice and the ways in which we might adopt a shared language that translates between research and practice.

Reflections and activities have been provided throughout the chapter as well as brief case studies to help you apply the theory to your own practice. Reflecting on our practice and understanding how or why we might be interested in certain aspects of practice have been themes throughout all chapters in this book. This is a skill that is central to doing research. As Skovholt & Trotter-Mathison (2014) suggest: “As important as methods may be, the most practical thing we can achieve in any kind of work is insight into what is happening inside us as we do it. The more familiar we are with our inner terrain, the more sure footed our [work] – and living – becomes” (p.17).

In preparation for writing this chapter, we not only drew on our own personal experience of working together and with colleagues, but we also met with a number of trainees and practitioners to consult as much as we could along the way. Further to this, we have drawn on a range of literature that has been highlighted throughout this chapter and in the recommended reading at the end. Invaluable to our own understanding has been reading around the subject, and we would like to draw your attention to a number of key texts, including: 1) Barkham et al. (2010), who provide a guide for delivering practice-based evidence, and 2) Bager-Charleson, du Plock & McBeath (2018), who explore practitioners’ views on psychotherapy practice and research. We would highly recommend that you read these key texts in conjunction with this chapter.

Different Routes into Research

Navigating issues around research, bridging the gap between research and practice, and accessing or doing research are some of the many challenges you may face as a practitioner. We may have a natural preference for research methods that are driven by practice or that align with efficacy or effectiveness studies. You may already have connected with your preferences after reading this book. With each new research challenge comes an opportunity to reflect, reassess and adapt for the benefit of our personal and professional development. These opportunities can arise at any stage during your professional career, and they may do so irrespective of whether you are embarking on research for the first time or are an experienced practitioner who has engaged with or led research in the past.

Sitting with our Clients

When you sit with your clients and seek to understand their frame of reference or presenting issues, consider how you are gaining more understanding and which skills you employ. In this sense, research starts quite simply in our daily practice. You listen to the words the client is using. You listen to the tone and pitch of their voice, you look at the client’s body language and facial expressions, and you may draw on your sense of smell or your felt sense of the client. You may also enquire about what has brought them to your consulting room, what their history is and what their symptoms are. Through this process of enquiry, we collect a significant amount of data that we constantly sift and analyse.

‘Analysing’ our Daily Data

Supervision is also a formal structure within which we sift and analyse data. When we present at supervision, we may, for example, talk about our felt sense of the client, or about wanting to better understand why we may be losing our empathy for a particular client. We may even be reporting that we are experiencing a similar feeling with all our clients. Through a process of data analysis with our supervisor, we can gain better insight and understanding about our client work. We can begin to see and notice patterns and themes in the client work, and from this we can begin to formulate hypotheses about our clinical work. Whether we practise individually in independent practice or work within an organisation, when we are at the point of forming a hypothesis about a client, we may choose to give further discipline and structure to our experiences, hunches and hypotheses and engage in some structured practice-based research. The research we undertake therefore comes directly from our practice, and the conclusions of that research will inform the same practice. We have broken these reasons for research down into three main areas: personal, interpersonal and contextual (see ◘ Fig. 14.1).

Fig. 14.1
An illustration of three blocks titled personal, interpersonal, and contextual depicts reasons for research.

Reasons for research

Personal

Your focus is on increasing your understanding of yourself and your internal world. This route is feeling orientated: you are the researcher and place yourself in the frame, with the aim of increasing your self-awareness and self-exploration. For example, we may be interested in our own counter-transference response to clients or in how our own experience of personal therapy has shaped us as practitioners.

Interpersonal

This route into research is more about how you relate to others, and thus the focus is on the interactions between the researcher (you) and the other–our clients, our supervisees, our colleagues. Taking this route requires us not only to be aware of our own feelings and focus, but also to understand and harness what is happening between ourselves and others. In essence, the focus is on meta-thinking and communication. For example, we may be interested in the transferential relationship or the parallel process in supervision.

Contextual

The third route into research is where the researcher is interested in shedding light on the relationships between people and the context within which they sit. This route best suits the researcher who wants to focus on how, for example, organisational culture impacts the client and client outcomes.

Navigating the Landscape of Evidence: Evidence-Based Practice or Practice-Based Evidence?

Is it possible to engage with quality psychotherapy research and maintain a full client caseload? Practitioners bring a wealth of knowledge from their therapeutic work and add endless value to the relevance of research. Engaging with research activities leads to both internal benefits for practitioners’ development and external benefits for the service and wider field of psychotherapy (Bartholomew et al. 2017). Therapists who engage with data collection in their service or embed feedback into their daily practice feel more able to use evidence to inform the way they work and report positive client outcomes (Castonguay et al. 2010). Therapists who contribute to research activities increase the relevance of research by ensuring that the factors being explored are closely aligned with therapeutic practice (Youn et al. 2018). This, in turn, builds an evidence base on the effectiveness of therapy and provides recommendations for further service development. There are also service benefits from engaging with research and translating research findings into practice (see also Glasgow, Lichtenstein & Marcus 2003). For example, you may learn from the demographic information you collect that certain clients do not approach your service and you may redesign an element of your service to be more accessible and accommodate diverse demographics. This inquisitive and adaptive response to research findings not only ensures that you continue to develop your practice and offer a service that best fits your clients, but it also triggers a ripple effect for the wider sector.

The more practitioners engage with research, the closer we get to an evidence base that is grounded in clinical work and informed by therapists on the ground. The sector and its policy decisions become more informed by the needs of clients, and the profession gains the evidence to shape and protect the workforce. Perhaps the more relevant observation is that all therapists are already engaging in research as part of their therapeutic offer, and that the theoretical base of our approach comes from years of active research (see also Rowland & Goss, 2013). We may differ in the theories of counselling and psychotherapy we draw on, but we all employ research skills to confirm or discount theories. So perhaps the more fitting question is how to do research rather than why.

The Landscape of Evidence-Based Practice (EBP)

Evidence-based practice (EBP) informs standard health and psychological health care and is concerned with ‘big questions’, such as ‘Does psychotherapy work?’ EBP often relies on treatment manuals and protocols with strict inclusion criteria to create homogeneous client groups. By doing so, EBP favours certain types of evidence and underpins the assertion that the research is rigorous. EBP is primarily concerned with highly controlled trials (i.e. randomised controlled trials, RCTs) that are often conducted across several sites with rigorous protocols and procedures to follow. Thus, EBP will attempt to control as many factors as possible in order to reduce the likelihood of a result being due to chance and to provide confidence that the subsequent client improvement was the result of a therapeutic impact. These design characteristics overlap with efficacy studies, which explore outcomes under ideal or ‘optimum’ circumstances. EBP imposes a hierarchy of trustworthiness on other types of evidence, which are drawn on only when higher methodological standards are not available, and as such certain approaches to research can feel alien and threatening to practitioners.

Understanding EBP

However, it is important that we understand EBP and how it will continue to shape our profession. In this way we can begin to engage and have a voice in some of the wider debates about our profession. There are also many benefits of EBP in the field of psychotherapy research, as it can strengthen the reputation of the profession and foster relationships between researchers and practitioners (Allan, 2019). The gap between research and practice is vast, with high-end empirical evidence having little effect on what practitioners do on the ground, but we can apply the learning from such research and adapt its methods so that they can be embedded in practice. Achieving this has the potential to add credibility to your own professional reputation, to the service or organisation you work in, and to the overall field of psychological therapy.

The Landscape of Practice-Based Evidence (PBE)

As practitioners, we can feel more at ease and at home with PBE. PBE challenges the notion of a hierarchy and starts from a more level playing field. PBE adopts a ‘bottom-up’ approach that is firmly rooted in practice and shaped by the needs of the service. PBE is concerned with everyday practice, and its primary focus is generating research questions that are wholly grounded in practice and the routine context in which we practise. For example, we may be interested in using a new outcome measure with our clients and then discussing feedback with our clients in sessions over time. We may already know the measure we want to use and have had training on how to use it, but we may not have the budget to use the form electronically and may not have administrative support to give clients the form. How do you find time for your client to complete the form without taking time away from the therapy session? What happens if a client arrives late or is too distressed to complete the form? Do you have the capacity to use the measure with every client? Rather than trying to tackle every challenge at once, you might decide to start using the form with a few clients until you find your new routine and feel ready to use the measure with all clients. You could invite your client to arrive 10 minutes early for their session and ask whether they would be willing to complete the measure so that you can discuss it in your session. Over time you may observe that your clients naturally arrive early for their session as it becomes part of the way you work together. Alternatively, you may find that these changes do not work for you at first. You may decide to ask a colleague how they introduced a new measure into their practice, or you may raise it with your supervisor to find another solution.

The development of Practice Based Research Networks (PBRNs) has enhanced the field of PBE even further by linking together groups of practitioners to collaborate, share good practice examples, conduct shared research projects and even pool service data (Zarin et al. 1997). The real strength of PBE is how encompassing it is in its application and use of a variety of different methods. Barkham and Mellor-Clark (2013) suggest that PBE is pluralistic when it comes to research methods, meaning that there is respect for and value in the range of different approaches and methodologies. Its accessible application can, however, be its greatest weakness in that it can, in the eyes of some, lack rigour and solid empirical evidence. PBE can be revolutionary in two ways. First, PBE is building a sound evidence base that is born out of clinical practice and as such is driven by our clients and places clients at the very heart of research findings. Second, given its accessible nature and relevance to practice, more and more practitioners are engaging in, conducting and contributing to research evidence. The contrast between practice-based evidence and evidence-based practice can at first seem quite stark, but it’s perhaps more helpful to view the methods as being on a continuum, and depending on your experience and research intentions you may align more closely with one at a given time.

When we asked clinical students and newly qualified therapists to reflect on what gets in the way of doing research, here’s what they said:

I’m relatively new in my counselling career and I’m working the max hours that I can for the safety of my clients, but still only making just enough money to support myself. I’m very aware that I have to be careful with what I do extra. I really hate that I’m at the point where I have to think about that. So, having some time protected within my role to do research would make a huge difference.

I think confidence. Would I be skilful enough to do research? Would my blind sidedness be too great for research that is valid and relevant? Am I going to drown before I get to the other side? I’ve started to realise that I don’t have to go at it alone and that it might be a good idea to get involved with an ongoing project or maybe even help if the right study came along. I’ve found a couple of local events, workshop type things and a relevant conference so I hope to meet other people interested in research there. I’ve also asked my supervisor who’s also an academic to see if they know of anything I could do.

These experiences were echoed among the counsellors we consulted, and they are not alone. Bartholomew et al. (2017) identified core factors that surround therapists’ involvement in research, including ‘making research feasible’, ‘the impediments to psychotherapy research’ and ‘benefits of doing research’. We’ve responded to these factors and applied both our own experience and the experience of colleagues and therapists we consulted to highlight the opportunity born from each challenge. ◘ Table 14.1 summarises the key themes with strategies to adapt the findings to your own personal and professional development.

Table 14.1 Summary of challenges and opportunities surrounding therapists engaging with research

The Sliding Scale of Efficacy-Effectiveness

Research methods can be used to evaluate interventions to determine whether they work and achieve the outcomes they set out to deliver. The methods we use to identify and reach these outcomes can be placed on a sliding scale, from research that is conducted under optimum conditions and is often highly controlled–efficacy studies–moving to research that is applied, usually pragmatic and embedded in the ‘real world’–effectiveness studies (◘ Fig. 14.2). Earlier in this chapter we explored the landscape of EBP and noted that this type of research aims to answer big questions that drive the future of health care and policy decisions, often relying on RCTs and using methods that attempt to control many components of the research (see also Kim, 2013). Systematic reviews are another example of research that applies methodological rules to combine study outcomes, often from RCTs, but also from broader research designs, such as the systematic scoping review commissioned by the British Association for Counselling and Psychotherapy to compile evidence on counselling for children and young people (see Pattison and Harris 2006).

Fig. 14.2
A tabular representation displays columns with the title evidence-based practice, experience and development, and practice-based evidence with their respective points.

Navigating the landscape of evidence-based practice and practice-based evidence

A key advantage of efficacy studies is that their controlled and manualised nature allows studies to be replicated, the findings of which can be pooled to provide a sound evidence base for the sector. For example, it was with these methods that Barth et al. (2016) were able to conclude that psychotherapeutic interventions for depression in adults are superior to receiving no treatment and that different psychotherapies have comparable benefits. A simple conclusion, but one derived from critically evaluating and extracting the findings from 198 RCTs including 15,118 adults receiving one of seven psychotherapies. However, such findings from efficacy studies, as essential as they are to evidence the profession, do not necessarily apply directly to practice, and the outcomes from efficacy studies are harder to achieve or may vary in practice (see Glasgow, Lichtenstein & Marcus 2003). Effectiveness studies are more liberal than efficacy studies and are typically less controlled, with fewer methodological constraints. These characteristics go hand in hand with practice-based evidence and as such can seem more welcoming to practitioners. For example, whereas efficacy studies identify outcomes from an intervention in an ideal environment and then seek to replicate the findings in a natural environment, effectiveness studies tend to start with the natural environment and shape the research methods around the intervention. Effectiveness studies still attempt to use rigorous research methods but will do so without dramatically changing the natural environment (i.e. a service). An example of this would be embedding a trial in a counselling service where, rather than protecting counselling sessions so that clients in the trial have priority over other clients, all clients are scheduled to see a counsellor when one is available, just as they would be in routine practice (Broglia et al. 2017, 2019). Such design components are not only more pragmatic and accessible, but they also address some of the limitations of efficacy studies that are often more difficult to replicate in routine practice. Some have also argued that effectiveness studies are more inclusive and representative than efficacy studies and as such are better able to respond to social justice issues (Allan, 2019). The strict inclusion criteria of efficacy studies and the need for large samples make it difficult to include clients who are less represented in therapeutic practice with regard to characteristics such as race, age, gender, religion, sexuality and disability. Aside from the limited client demographic, research inclusion criteria may also overrepresent clients of a certain clinical severity, such as clients who meet a mild or moderate clinical threshold on a routine outcome measure. Similar sampling issues and the transient nature of certain client groups create further difficulty in researching clients with more complex presentations. These are common challenges of designing any research in the field of counselling and psychotherapy, and whilst it is not always possible to control for every extraneous factor, there is inevitably more variability (and therefore uncertainty) introduced when research is less controlled. For these reasons it is perhaps helpful to view each research design decision as falling on a sliding scale that moves between efficacy and effectiveness studies, recognising that some factors may be more feasible to control than others.
◘ Figure 14.3 presents this sliding scale and highlights that your research intentions will influence whether you adopt methods from efficacy or effectiveness studies.

Fig. 14.3
A tabular representation displays the columns titled efficacy, context, and effectiveness. It depicts the sliding scale and highlights the respective points.

The Efficacy–Effectiveness continuum

Working with a Shared Language

We can take language for granted, and when we gain our own expertise we automatically make a set of assumptions regarding the level of knowledge of others. These potential language barriers aren’t unique to the realms of research and practice–there are examples of different language use and assumptions being made between further and higher education institutions, psychologists and sociologists, and quantitative and qualitative researchers. Adapting the language we use takes practice and patience, and if either is lacking then it can add a further layer of confusion. It’s helpful to bear this in mind when you’re choosing your own language and to be mindful of the types of assumptions you might make before entering the conversation.

Internal and External Impact

Following the argument that the researching practitioner will inevitably place themselves within the research, it is important to consider the impact of any research on yourself as the researcher, on your institution or organisation if relevant, and on the participants–the clients. Whichever is your preferred route into research, as a practitioner you will almost inevitably be revealing a lot about yourself, your feelings, your clinical approach, your judgements and your views and values.

A precursor to undertaking this type of research is for you to consider whether you have the right level of support both professionally and personally to tolerate this level of exposure. Your findings may also challenge strongly held views within the profession, by your colleagues and your institution. As a practitioner you are very close to these groupings and have to be able to continue to work professionally after any findings are in the public domain. A critical way of sustaining yourself as a practitioner engaged in research is to establish a trusted and supportive relationship between yourself and the professional researcher where the research and research methods are built on co-design and co-authoring. Through this honest and authentic collaboration, your mutual skills and experience of practice and research will merge–then you will truly be involved in research in action.

Summary

This chapter has encouraged you to address some of the challenges and opportunities practitioners experience when they engage with research. We hope to have encouraged you to build an evidence base born out of clinical practice. We need practitioners, such as yourself, to continue to engage with research and question your daily practice. In this chapter we presented some ways in which you might navigate issues around research and practice by considering the different routes into research. We referred to the broad remit of evidence-based practice and practice-based evidence and presented the sliding scale of efficacy and effectiveness, with the aim of helping you find an equilibrium to suit where you are in your own personal and professional development. Our priorities for engaging with research will naturally vary over time. We explored examples of how to translate theory into practice and encouraged you to consider the impact that research could have on your daily practice. It is hoped that the activities throughout the chapter provided an opportunity to reflect on your practice and understand how or why you might be interested in certain aspects of practice, and to recognise that such skills are central to doing research.

Throughout the book, the different authors have tried to convey the fundamentals of research and practice, woven into wider contextual aspects. We hope that this final chapter has contributed further to demystifying some of the thinking around evidence, whilst offering a basic map that will assist and support you when navigating and pioneering your own research and practice. The most valuable asset to us in conducting any research has been relationships: our relationships with each other, with colleagues and with peers, as well as our relationship with ourselves. Our own personal insight has been, and continues to be, invaluable to us as both researchers and practitioners–it is our common ground. From this common ground we can begin to explore our different contexts whilst remaining open to the empirical knowledge available to us. We hope that this chapter, together with the others, has triggered ideas and provided you with the inspiration to enjoy many research projects, and that they have helped to build much needed knowledge in the fields of mental health and emotional wellbeing.