Introduction

The health care systems of many high-income countries will face substantial challenges in the near future: a growing population of old people, a surge in the need for health care services, and fewer people to provide and finance those services. Welfare technology (WT) has been launched as one important means of meeting these challenges: WT can free resources, provide help to those in need, reduce costs, and serve as an area of research, development, and innovation.

Whether WT can meet these demands and avoid unintended side effects is a key question. Unfortunately, there are very few studies documenting the outcomes of WTs. Moreover, although WTs do not necessarily differ from traditional technologies as such, they do differ from traditional health care technologies: they are used in different arenas (e.g. at home), by other people (patients, relatives, or new health professions), for particular groups (e.g. old people and/or the physically or mentally impaired), for different purposes (e.g. social stimulation), and outside traditional health care organizations. This raises a series of ethical questions: Do we need WTs? Are they good or bad? What does it take to obtain a desired outcome?

As questions of welfare are questions of “the good life,” and as “the good life” is a traditional topic in ethics, it comes as no surprise that WT is of interest to ethics. Whether we should introduce advanced medical monitoring and testing in private homes, use tracking technologies for people with cognitive impairments (e.g. people with dementia), or use robot pets to stimulate lonely, old, or cognitively impaired people are more than mere choices of technology and technical solutions.

How to Identify Welfare Technologies and Related Ethical Challenges?

In order to identify WTs and potential ethical challenges, a literature search in Ovid MEDLINE, Ovid EMBASE, Ovid PsycINFO, Cochrane Systematic Reviews, SveMed, DARE, the HTA database, and NHS Economic Evaluations was performed (March 11, 2010), identifying 1,976 references, of which 1,185 explicitly mentioned ethical issues. Limiting the search to systematic reviews identified 731 references. Figure 1 shows how the references from the literature searches were handled. Details on the searches are described elsewhere (Hofmann 2010).

Fig. 1 Results from the literature search

What is Welfare Technology?

The literature search identified a vast variety of technologies. The types of technologies are grouped in Table 1 according to their purpose and function.

Table 1 Welfare technology classified according to purpose and function

The table is not complete, and the categories are neither exhaustive nor mutually exclusive. The point, however, is to give an overview of the types of technologies that are available in order to highlight the ethical challenges. WT is a heterogeneous group of technologies, and the ethical challenges will vary with the kind and use of each technology. Assessing the ethical challenges of every kind of WT is beyond the scope of this review. The point here is instead to highlight some key challenges of an emerging class of technology. Hopefully, this review can serve as a useful point of departure for detailed ethical analyses of particular kinds of new WT, and for addressing important challenges when developing and implementing WT.

How to Identify Ethical Challenges of Welfare Technologies?

Ethical challenges of WT were identified with a (Socratic) question-based method developed for addressing ethical challenges in Health Technology Assessment (HTA). The approach has been used for a wide range of technologies and is described in detail elsewhere (Hofmann 2005a, b, 2008). The point of this approach is to highlight the ethical challenges that appear relevant for decision making in an open and transparent manner, not to provide explicit recommendations or deliver rigid conclusions.

What is the Utility of Welfare Technology?

The rapidly growing literature on WT clearly shows that it has a moral end: WT is supposed to provide better and more focused care, reduce risk and increase safety, increase coping and self-determination, make it possible to stay at home longer, prevent harm (from falls, fire, robbery), make resource allocation more just, and promote technology development, commercialization, and growth. Some, but not all, of these intentions have been operationalized in specific endpoints or have resulted in documented outcomes.

The literature review identifies some examples of WT utility. E.g. mobility technologies can significantly increase flexibility, agility, and mobility, such as wheelchairs that can climb staircases (Amin 2004). It has also been illustrated how compensatory technologies can enhance a person’s independence, autonomy, and dignity (Milner 1995). When you can do what others do, limitations and differences become less apparent.

Internet-based psychotherapy has been documented to be effective (Barak et al. 2008). The same has been documented for certain types of telemedicine for home services (Bensink et al. 2007; Gaikwad and Warren 2009). Distance monitoring of heart failure has been shown to reduce mortality and, to some extent, rehospitalisation (Clark et al. 2007; Louis et al. 2003; Maric et al. 2009; Martinez et al. 2006), while home monitoring in diabetes has resulted in better glycated haemoglobin (HbA1c) levels (Medical Advisory Secretariat 2009). Systems for drug handling can improve safety (Ammenwerth et al. 2008). WT can also prevent harm, e.g. from fall accidents (Gillespie et al. 2009), and fall detectors can make people less anxious and less afraid of falling (Brownsell and Hawley 2004). Correspondingly, access control technology can make people with dementia feel safer (Margot-Cattin and Nygard 2006). Communication support and compensation technologies make it possible to provide better care services (Jones and Brennan 2002) and can have autonomy-promoting effects (Castillo 2005; Nicolas et al. 2005; Pare et al. 2007) because they can improve decision making. People with sensory impairment can increase their welfare through sensory-compensating technologies; e.g. deaf people can benefit from video communication (Center for Medical Technology 2005). Technology for communication support may also increase social contact and reduce depression (Griffiths et al. 2009).

Home-based hospital services may speed healing processes (Dinesen et al. 2008), be conceived of as supportive and lifesaving (Earle et al. 2006), and may be more effective than hospital services (Mowatt et al. 2003). Additionally, they provide an opportunity for dignified treatment at home (Pannuti and Tanneberger 1998a, b).

Aids for the elderly increase mobility and enable activities that would otherwise not be possible (Haggblom-Kronlof and Sonn 2007). WT may also enable people with various kinds of disabilities to live more like non-disabled people (Jutai et al. 2000; Hansson 2007). People with dementia may gain self-confidence from non-advanced information technology that compensates for impaired memory, and they may benefit from communication technology for social interaction (cell phones), while tracking technology has been documented to increase safety and reduce fear and insecurity (Lauriks et al. 2007). Various forms of technology to increase safety and perform specific tasks in people’s homes (smart house technology) may increase security and communication with the outside world (Chan et al. 2008; Chapman 2001; Gentry 2009), and such technology is welcomed by the elderly themselves (Demiris 2008; Demiris et al. 2008).

The literature review shows that there are many articles describing beneficial outcomes of WT, but only a few empirical studies document such outcomes. Moreover, most studies are small and of medium or poor quality. Hence, it is difficult to assess the utility of most WTs.

Useful for Whom?

Assessing the outcome of WT is also challenging because it can be useful from a wide range of perspectives: Is WT useful for the health care provider, for the client (patient or user) (Botsis and Hartvigsen 2008), for relatives, for the industry, or for society at large? Some applications of WT turn out to be more useful for the health care provider and for society than for the “user” (Cash 2003). Tracking technology is but one example. Communication technology aiming at reducing the number of visits to people’s homes is another.

Correspondingly, the choice of endpoint is challenging: Should we strive for increased survival, reduced morbidity (e.g. number of diagnoses, admissions, medication), (para)clinical measures, or for reduced vulnerability, increased function and coping, independence, or quality of life? Diabetes surveillance is but one example, where better control of HbA1c levels is documented (Medical Advisory Secretariat 2009), but it is far from obvious that this reduces morbidity and mortality or increases quality of life (Wieczorek et al. 2008; Birren et al. 1991).

Relatives may play an important role in specifying and assessing WT, e.g. in the case of people with cognitive impairments (Rialle et al. 2008), but even here the balancing between the person’s interests and other stakeholders’ interests may be challenging.

Just Distribution of Welfare Technology

Communication technology may reduce differences in access to health care services. At the same time, technology may be discriminatory (Department of Justice 1991) and enhance differences and inequalities (Demiris et al. 2006; Goodwin et al. 2007; Perry et al. 2009). For WT, too, there are documented biases in access to technology (Baker and Moon 2008; Gatward 2004). For some WTs (e.g. tele-nursing), gender differences are documented (Hoglund and Holmstrom 2008). Although it may be possible to avoid the effects of “the digital divide,” differences in WT quality may create and enhance other inequalities (Bauer 2003).

If WT alters the conditions of aging, it may challenge traditional principles of prioritization (Farrant 2009), and WT may be subject to age discrimination (Mott 1990). Challenges of prioritization may also become pressing in cases where WT imposes significant burdens on relatives (Levine 2005). Is it fair to expect relatives to take on extra burdens related to extended technology use?

Risk and Harm: Big Brother Sees and Helps You

Technologies for tracking and disease monitoring, as well as technologies for distance treatment, raise basic challenges concerning surveillance, autonomy, confidentiality, and privacy. Health professionals are aware of such issues (Miskelly 2004), and patients and technical personnel are concerned with safety issues (Dorsten et al. 2009; Doyle 2007; McAward 2005; McQuaid 2007). On the other hand, tracking technology may reduce the need for other, more restrictive alternatives, such as physical restraints. One alternative is so-called subjective barriers, such as labelling, mirrored doors without knobs, RF-coded door openers, etc., but the utility of such measures is unclear (Price et al. 2001), and the balancing between risk and benefit is challenging (Robinson et al. 2007).

In particular, several studies point out how tracking technology raises special issues of confidentiality and privacy (Anderson and Labay 2006; Bharucha et al. 2006; Cochran et al. 2007; Foster and Jaeger 2008; Hagen 2007; Levine et al. 2007; McShane et al. 1994; Niemeijer and Hertogh 2008; Plastow 2006), as well as of dignity (Hughes et al. 2008b; Welsh et al. 2003). While tracking technologies have great potential utility, it is important to consider who will gain: Is it the person being tracked (Hughes et al. 2008a), is it health care personnel, or is it the relatives who feel safer (Bail et al. 2003; Cahill 2003)? Tracking technology raises the question of who will decide whether and how it can be used, as well as whether it can be forced on people (Foster and Jaeger 2008).

The risks and disadvantages of medical monitoring may be difficult to balance against the benefits. Diabetes care may serve as an example. Continuous monitoring of physiological and biochemical parameters may increase coping and self-care (Anderson et al. 2007; Farmer et al. 2005), but it may also be conceived of as control and promote distrust (Anderson and Funnell 2005; Gammon et al. 2009), as well as embody a moral assessment of a person’s (lack of) self-control (Hilden 2002). Tele-cardiology is another example, where it is possible to monitor not only implanted devices, such as automatic defibrillators, but also a series of other parameters related to a person’s activity and conduct (Boriani et al. 2008; Celler et al. 1995; Chaudhry et al. 2007).

This raises further questions of surveillance and control (de Bruin et al. 2008; Le et al. 2008). E.g. a change in a person’s activity can trigger an alert, which reduces the person’s autonomy and control (Le et al. 2008). At the same time, surveillance of activity raises the question of how to define “normal activity,” and of who is to decide. Important target groups express concerns about privacy rights with camera-based surveillance (Demiris et al. 2009). Another relevant question is whether persons without the capacity to consent have the right to neglect their health.

Health information systems, telemedicine systems, and home hospital systems raise corresponding challenges with regard to confidentiality and privacy (Demiris et al. 2006; Dinesen et al. 2008; Dorsten et al. 2009; Goins et al. 2001; Levy and Strombeck 2002; Farrant 2009; Magnusson and Hanson 2003; Mohan and Razali Raja 2004; Nymark 2007; Pharow and Blobel 2008; Pugno 2002; Savastano et al. 2008; Waldron et al. 2000). The proportionality principle in law indicates that the extent of registration and surveillance shall be proportional to the benefits gained (Kubitschke et al. 2009). It will therefore be especially important to document the outcomes of this kind of WT.

Technology Providers and Other Stakeholders

WT for communication support relies on third-party suppliers of network infrastructure, devices, and technical services, which may also challenge confidentiality and privacy (Carlisle 2007), as well as the relationships between professions, e.g. when people with different (technological) competencies enter the health care arena and become indispensable (van Hoof et al. 2007).

Several kinds of disease monitoring and surveillance, as well as some distance treatment technologies, involve relatives and family members in various roles (assistants, contacts, super users). This can alter the relations between family members (Gammon et al. 2009) and can impose significant responsibilities with regard to health issues, which may cause stress in a way that affects the patient (Dinesen et al. 2008).

WT, and especially communication support, breaks with traditional geographical limitations and challenges judicial norms as well as traditional ethical principles. This may cause problems with regard to delimiting responsibility (ethically and legally), e.g. if a person in one jurisdiction receives WT from another jurisdiction (Dickens and Cook 2006; Finch et al. 2008). Even with internet-based systems to support patient self-care, health personnel are concerned with legal aspects and misuse (Nijland et al. 2008). Moreover, overly close connections with industry pose ethical challenges in this field as in others (Aebi et al. 2008).

Does Welfare Technology Change People?

Technology changes the physical and social context as well as human values (Hansen and Drivsholm 2002), and calls for reflection. However, WT may also alter people more directly. Implants are technical devices that provide possibilities and pose challenges (Barrocas et al. 2003), whether they are biochemical, molecular, physical, or ICT-based (Bauer 2007). New implants go beyond joints, pacemakers, and defibrillators, to improving function and behavior.

Even if implants may solve a series of challenges (e.g. increasing survival and safety as well as avoiding alternatives that are considered ethically more challenging), they may pose new ones. The most obvious are related to turning off or removing WTs (Barrocas et al. 2003), human enhancement, mental changes (Dees 2007), identity-altering implants (Hansson 2005), and autonomy-changing robots. Prostheses, robots, and even direct brain connections are becoming ever more pervasive and raise a series of ethical questions (Isa et al. 2009; Mainzer 2009; Voelker 2005). Moreover, WT could make us live so long that the question of whether we have a duty to die becomes pertinent (Hardwig 1997).

It is well known that rehabilitating WT (vs. WT meant mainly to enhance), such as cochlear implants versus amplifying hearing aids, also comprises social and ethical challenges (Balkany et al. 2001; Lehoux and Blume 2000). Referring to what is natural is difficult, as it appears to be “natural” for human beings to alter their nature (Barilan and Weintraub 2001; Boff 2006).

Identity- and integrity-related technologies are especially challenging (Ahmead and Bower 2008; Gillett 2006). E.g. technologies altering or compensating for intellectual functions differ from other technologies (Perry et al. 2009). This is partly because the decision-making process for implementing such technologies is challenging, as the people concerned often have difficulty giving valid consent (due to lack of understanding or decision-making capacity), and partly because control and safety are transferred from the person, and from health care professionals, to the technology.

It is maintained that some kinds of WT may alter a person’s embodiment (Latimer and Schillmeier 2009): what it is like to be a living body in the world (as patient, relative, employee) (Lopez and Domenech 2009). It is also argued that WT may promote a Cartesian conception of the body at the expense of a phenomenological one (Lopez and Domenech 2009). Technologies that stimulate social activity (such as robot pets) are ascribed both a social and a moral status (Melson et al. 2009). Ultimately, human beings may be conceived of as self-reproducing technological units (Rabinowitz 2005). The point is that traditional distinctions between science and values, or between machine and man, may alter in a world of WT, where there is a close connection between technology and human beings (Widdershoven 1998).

Autonomy and Consent

As already pointed out, WT can rehabilitate and enhance people’s autonomy, e.g. through regained or enhanced control. This may support empowerment and people’s ability to consent (to various forms of WT). For simple (lo-tech) WT that is easy to comprehend, people may be ready to give valid express consent for the installation, placement, and use of WT.

However, some types of WT are advanced (hi-tech) or meant for vulnerable groups with reduced cognitive capabilities. This raises questions about consent. WT may also reduce people’s autonomy (and valid consent) through enhanced dependence (Lopez and Domenech 2009). Moreover, when WT is used in a private home, other people living in the same house may be involved, and hence may have to consent.

Organizational Challenges

As indicated, one of the great potentials of WT lies in its ability to cross professional, organizational, and social borders, find solutions, and create new connections. At the same time, even good solutions may meet organizational constraints. Some of these may be related to professional territories (Hardey et al. 2001). E.g. professionals’ distrust of decision support systems may pose moral challenges for the implementation and use of such systems, as well as threaten the safety of patients (Alexander 2006; Perry et al. 2009). Hence, technically successful implementation of WT may not be beneficial unless the organizational constraints are addressed (Hofmann 2002b).

Hospital at Home or Technology at Home to Avoid the Hospital?

It is widely assumed that people prefer to remain in their home if possible (Kubitschke et al. 2009). Providing health care services at home has been shown to be effective, e.g. home-based follow-up of stroke patients (Larsen et al. 2006). However, home treatment (with oxygen) can also raise safety issues (Agence d’Evaluation des Technologies et des Modes d’Intervention en Sante 2009). If advanced health technology spreads from hospitals to private homes, the challenges recognized in hospitals will spread to the home: withdrawal of treatment, autonomy to refuse treatment, advance directives (Laakkonen et al. 2004).

Some of these challenges are related to responsibility and competency. Compensation and communication technology requires distinct professional competency. Hospital-like technology at home may blur responsibilities and generate technology dependence (Arras 1994; Arras and Dubler 1994). Home respirators may be one example where traditional responsibility structures are challenged, in particular affecting relatives (Hammer 2000; Levine 2005).

The same kinds of challenges arise in remote diagnostics (Manhal-Baugus 2001; Stanberry 2001). While advanced technology used in private homes appears effective from a clinical and organizational perspective (Lehoux et al. 2006), it may not be meaningful for patients and proxies (Lehoux and Law 2004). Health professionals may also feel that the potential to support and enhance patients’ agency is reduced by the technological rationality dominating WT (Liaschenko 1994, 2001).

Home is naturally seen as a place of comfort, privacy, and security, and making the private home an arena of technological intervention may make a person feel alienated and insecure at home (Savenstedt et al. 2006). It may also be a challenging experience to be at home in a condition or situation in which one normally would not be at home (Huisman-de Waal et al. 2007), and to find one’s home full of technology. Personal integrity and privacy are at stake when intervention becomes invasion. “Dignity is also a core issue …, if intrusive and privacy-invading medical procedures are re-located to the home.” (Kubitschke et al. 2009). Nevertheless, there appears to be a difference between the hospital moving into your home and your using technological aids at home.

Advanced treatment at home may also pose challenges of prioritization, e.g. in the case of home dialysis (Alloatti et al. 2000) and enteral nutrition (Alvarez Hernandez et al. 1987). Paying attention to strong technology promoting forces may be as important in home care as in hospital care (Lantos and Kohrman 1992).

Status of Welfare Technology

Organ-specific diseases in the upper part of the body detected by advanced technology have high prestige (Album and Westin 2008). Most diseases and impairments in the elderly have low prestige, which may influence the direction of research and development for WT. Correspondingly, as the status of the most vulnerable groups may be low, the eagerness to develop and implement technology may be lower for these groups (Gentry 2009). Even though the elderly will be much more used to technology in the future, the question of whether WT is a “quick fix” is still relevant. There may also be areas and tasks that are not suitable for WT (Cunningham 2006; Sparrow and Sparrow 2006; Godwin 2005).

The Challenge of Ignorance

The lack of high-quality evidence on the effectiveness and efficiency of WT, as well as on how people live with WT (Gaikwad and Warren 2009; Gentry 2009; Garcia-Lizana and Sarria-Santamera 2007; McGowan et al. 2009; Maric et al. 2009; Pare et al. 2009), is ethically challenging. WT may be implemented without rigorous testing, because the regulation of medical devices is more lax than the regulation of drugs (Wilmshurst 2011).

Basic Challenges: Technological Possibilities and Limitations in Perspective

So far, most WTs are not advanced, hi-tech, and resource-demanding like the technology that usually poses ethical challenges. They do not alter basic biological properties, such as DNA, or raise dramatic issues, such as euthanasia. Still, they pose some significant challenges.

One reason for this may be that relating human welfare to technology violates intuitions and traditions, e.g. the ancient and historically persistent distinction between eudaimonia and techne (Hofmann 2002a). However, modern man appears to be constituted by technology, and our welfare and happiness are intimately connected to the existence and use of technology.

WT therefore raises the profound question of the good life and of being human, indicating why even “ordinary,” “lo-tech” WT may pose ethical challenges. This may explain the diverse reactions to WT, where some embrace it while others appear to be hostile, and why some think that words like care and welfare go hand in hand with technology, while others consider them to be contradictions in terms. E.g. WT is heavily criticized because it may be used as a replacement for proximity, care, and human relations (Melson et al. 2009; Percival and Hanson 2006; Sparrow and Sparrow 2006). Correspondingly, one can ask whether WT promotes a particular perspective on human welfare. Technology may direct attention towards instrumental values, productivity, and efficiency, and away from other phenomena important for human welfare, such as hope, coping, vulnerability, dignity, and meaningfulness.

The controversies over WT appear to follow the divide between “the two cultures” (Snow 1959): between the scientific and the humanistic, between explanation and understanding, between the instrumental and the relational. Whether WT will deepen or bridge such divides is a key question (Widdershoven 1998), still to be decided.

The Normative Classification of Welfare Technology

WT may be classified in many ways, and each classification expresses normative preconceptions and constraints. Classifications according to type of technology (robots, sensors, IT) direct our attention towards technical and industrial aspects, while classifications according to purpose and function guide our awareness towards the ends. Taxonomies based on target groups and institutions may be fruitful for assessing and addressing conflicts of interest and justice, but may ignore other aspects. One way to highlight ethical challenges is to classify WT according to function and intention (Hofmann 2002c, 2006).

Conclusion

This literature review gives an overview of the ethical challenges of new and emerging welfare technologies. It illustrates that WT is a heterogeneous class of technologies that have to be assessed individually. However, some general challenges are identified. First, WT is a class of technologies that are likely to be applied in people’s homes, posing problems related to alienation and to feeling safe. Here Martin Heidegger’s term uncanniness (Unheimlichkeit) may be relevant (Heidegger 1977). Second, WT involves many stakeholders, posing questions of (a) who will gain from WT and (b) who is responsible for implementing, using, and maintaining WT. Third, many WTs involve third-party actors with access to sensitive information, such as service providers and relatives, which poses challenges to confidentiality and privacy. Fourth, as most technologies are related to prestige, are not distributed equally, and can cause social discrimination (e.g. a “digital divide”), WT raises questions of justice, especially since it is intended to be used on a broad scale. Fifth, WT challenges basic conceptions of humanity, such as vulnerability, dignity, and care. It may represent a colonization of the human life-world by an instrumental rationality, to phrase it with Habermas (1987).

These ethical issues should be taken into account when developing, implementing and using welfare technologies. Hopefully this review can be a useful point of departure when assessing particular WTs in context.