1 Introduction

Ethics may once have been associated with Aristotle, but today advanced technology is generating ethical problems unique to our own age. Should robots become personal assistants? Should they have human rights? Under what conditions should we monitor Grandma? If human enhancements using prosthetic limbs, steroids or genes give an athlete an advantage, should he or she be permitted to compete in the same event against those without these enhancements? What if enhancements remove the necessity for institutionalization, but the trade-off is a compromised identity?

We live in an age of technology, and in our age ethical questions often arrive wrapped in complex technologies. This chapter focuses on three categories of advanced technologies: robotics, sensors used to monitor and track people, and human enhancements. While Disability Studies has critically analyzed Bioethical problems such as those relating to abortion, suicide, pre-natal screening and genetic testing (Ash et al. 2008; Wasserman and Asch 2009), it has been less attentive to discourse leading to a disability perspective on the development and use of these and other advanced technologies (Seelman 2001). While both assistive and advanced technologies can be enabling, only the latter can change the nature of what it is to be human. The impact of their use can obscure the boundaries between human beings and machines and render obsolete current distinctions between cure and independent living, and between therapy and cosmetic interventions. Participation by citizens in decisions about the design, development and use of the artifacts of science and technology is a major concern for democracy (Jasanoff 2011). Consciousness of the nature of advanced technology and the scope of its impacts, both existing and potential, can make us very uncomfortable. Nonetheless, people with disabilities have rights and responsibilities as citizens and, in the case of advanced technologies, as targeted markets of end users. This chapter is a preliminary exploration of the need for broader-based participation by the disability community in critical analysis of science and technology development and use.

A number of intellectual movements and approaches are relevant to these problems. The Transhumanist movement (Bostrom 2005), for example, affirms the possibility and desirability of fundamentally transforming the human condition, thereby eliminating aging and, by association, disability. The Ableism approach (Cambell 2008; Siebers 2008) operates in the here-and-now, taking aim at discrimination on the basis of physical or mental differences and at attitudinal and physical barriers to equal opportunity. Finally, more phenomenological and subjective approaches akin to Feminism direct us to focus on our own experiences as people with disabilities (Shakespeare 2006). A scientist with a seemingly renaissance mind, Gregor Wolbring, the son of a parent who used thalidomide, counsels us to focus more on citizenship (Wolbring 2012). People with disabilities should join the discourse.

While a comprehensive critique of advanced technology would require exploration of vast unmapped and unexplored intellectual and experiential terrain, this chapter has more modest dimensions and more concrete aims. Advanced technologies are examined within the context of ethics, especially values important to people with disabilities such as self-determination and human rights, and related principles in Bioethics. The advanced technologies introduced later have been chosen more on the basis of their relevance to the here-and-now than on science fiction. They are on the market or soon to be introduced into the marketplace (Chen 2012). They include robots for kids with autism and brain-computer interfaces for those with Amyotrophic Lateral Sclerosis (ALS). Advanced technologies such as robotics, sensors used to monitor and track people, and human enhancements are not routinely covered through Medicare and Medicaid. However, they may be available in the course of research projects that use people with disabilities and older adults as research subjects, in direct out-of-pocket sales transactions, and as educational tools.

Popular culture and the scientific literature provide interesting representations and evidence about the social “nature” of these technologies and the experience of people with disabilities who use them. Technical factors include new materials such as those used in prosthetic feet, design processes which may or may not enable consumer choice options, and the introduction of genetic materials into the body that may alter human intelligence. Social factors involve acceptance by the individual and adoption by society. Cultural representations in film and evidence from scientific studies are culled to describe the technical and social characteristics of robots, sensors and human enhancements. Applying values such as self-determination and human rights from disability rights traditions, and related ethical principles in western bioethics (Beauchamp and Childress 2009), to actual case studies, we then identify and examine some of the ethical problems experienced by users in order to discern the contours, if not the core concepts, of a disability perspective.

2 Significance

Advanced technologies have more profound, long-lasting effects on human beings and society than their predecessor assistive technologies. As later sections of this chapter will illustrate, they can substitute for human judgment and affect the way people think and feel. Ethical dilemmas (Hamric et al. 2000) arise from the options generated by the enabling features of advanced technology. An individual may choose to supplement a malfunctioning memory by borrowing memory and accepting support from a robot, even if the robot is controlled by undetermined sources outside the end user’s network but within the robot’s communication network. The other option available to the individual is often institutionalization. Ironically, this non-medical intervention for supplementing and supporting a person with a disability may lead to cure if enough memory is borrowed so that the functioning that is enabled approximates that of someone without memory loss. Where is the line between maintaining one’s identity while using the memory of another and inadvertently taking on the identity of another, or at least succumbing to the control of an unknown other? These problems blend well with colonization and identity, which are themes of our book. Colonization involves absorbing and assimilating people into the culture controlled by a more dominant group, thus destroying any remnant of the less powerful culture. Depending on scale and other factors, experiences such as borrowing memory may or may not severely harm the disability community and compromise the identities of, for example, previously free persons with disabilities. The disability studies community may find problems such as these worthy of further critical analysis.

2.1 Advanced Technology and Health Care

Advanced technologies have been introduced into health care to enhance quality of life (Helal et al. 2008; Seelman et al. 2014; Stephanidis 2009). These technologies may be used for therapy and support, or they may be cosmetic and aimed at a level of performance which, without them, would be impossible to achieve. People with disabilities are considered an important market in health care and advanced technologies (Chen 2012; “Wheelchair Toyota’s Robot” 2012; Wolbring n.d.; O'Reilly 2012). Industries actively advertise the enabling features of products, describing their positive impact on the quality of life of older adults, people with disabilities and their caregivers. Government makes decisions about the safety and effectiveness of many of these products. However, the implications of these technologies for the quality of life of people with disabilities await the advent of widespread discourse in the disability community.

2.2 Human Rights, Bioethics and the Challenges of Containing New Technology in Old Wineskins

Self-determination, human rights and justice are fundamental values for the Disability Rights Movement. The right to universally designed and available technology is widely accepted by the disability community. These values are incorporated in the UN Convention on the Rights of Persons with Disabilities and serve as the moral foundation of the Independent Living Movement. Human rights laws and conventions are the first line of defense against discrimination. Bioethics, related laws, regulations and codes of practice in the health professions form a second line of protection.

The Hippocratic Oath may be the most familiar application of ethics in science and technology. It is an enduring example of the value placed on beneficence as a foundation of modern health care. Beneficence counsels that the basic responsibility of the clinician is to do good for the patient. Bioethics is rooted in human biology and provides guidelines for the ethical behavior of human beings. The Oath assumes that only humans are moral agents; robots are therefore excluded, but as an example provided later from the movie Robot & Frank suggests, robots may struggle with human dilemmas such as choice. Over the centuries the health professions have developed a comprehensive biomedical ethics framework of patient protection, well-being, and confidentiality of information. Informed consent, for example, is the process by which a fully informed patient can participate in choices about her health care. It originates from the legal and ethical right the patient has to direct what happens to her body and from the ethical duty of the physician to involve the patient in her health care. Institutional Review Boards approve, monitor, and review research involving humans. Researchers and clinicians have adopted basic principles to guide them in protecting research subjects and as a basis for professional codes of ethical practice. They include: respect for persons and autonomy, beneficence, non-maleficence (do no harm) and justice (National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research 1979). These principles, augmented by self-determination and human rights, will be used to examine the case studies presented later.

Adoption and use of advanced technologies has breached the framework of Bioethics and governmental regulation, so that the flow of risks and benefits associated with them is largely undirected. Today, with the introduction of advanced technology and globally competitive industries into health care and the influx of engineers and other non-health professionals, the field of Bioethics no longer provides a comprehensive framework to guide behavior and decision making. Ethics relevant to health care has diverged into many subfields, including bioethics (Beauchamp and Childress 2009), computer ethics (Johnson and Miller 2009) and human-machine ethics (Veruggio and Operto 2008). There is no integrated ethics with which to approach a disability perspective on the use of robotics, sensors and enhancements. Nor have ethicists and governments taken head-on the problem of cultural differences within the context of advanced technology. Other parts of the world and corporate culture may be more pragmatic and less identified with Western ethical standards. On the one hand, criteria for technology design, adoption and use decisions may be reduced from ethics to technical standards for the safety and effectiveness of a device (Veruggio and Operto 2008). On the other, the Disability Community has enshrined the human right to universal design of technology in the UN Convention.

3 Method

Two approaches are used to identify and examine the ethical dimensions of advanced technology—outside-in and inside-out. Using popular culture and scientific literature, the outside-in approach introduces some of the technical and social characteristics of robots, sensors used for monitoring and tracking and of enhancements. Ethical concerns about personhood, identity and subsequent colonization and discrimination are identified.

Applying the inside-out approach, we turn to a process of identifying and examining ethical problems within consumer case studies, using a principle-based Common Morality approach from Bioethics (Beauchamp and Childress 2009). A common morality community is formed based on acceptance of principles such as respect for persons (autonomy), beneficence (non-maleficence) and justice. Common Morality assumes that only human beings can be moral agents, denying agency to those with artificial intelligence such as robots. While principles such as autonomy are content-thin, the case studies are content-rich; the combination of the two provides a basis for fixing the contours of the ethical dimensions of problems generated by these technologies. The common morality approach encourages adaptability. If the results from examination of the case studies suggest that existing norms for ethically obligatory behavior are insufficient, then additional guidelines may be proposed. This adaptability is particularly important when dealing with new technologies such as robotics, tracking and monitoring, and enhancements, because standards governing their use and limiting their impacts may not have been developed or are still evolving.

As the previous discussion suggested, Bioethics is a pail full of holes and a can full of worms. While no longer sufficient to contain and guide human decisions about advanced technology, it is a necessary tool. Rooted in Western culture, Bioethics may yet be enhanced to construct an adequate ethical framework for the design and use of advanced technology used in health care and independent living.

4 What Are Advanced Technologies? Implications for the Disability Community

Advanced technologies almost always involve information and communications technology and computers, standing alone or embedded in other products and networked into global transmission systems. They are also characterized by sophisticated electronics, software, robotics and artificial intelligence (Seelman et al. 2014). Software, guided by algorithms rather than by human beings, routinely instructs these technologies on what to do (Peterson 2011). Algorithms are step-by-step problem-solving procedures for decision making. For example, algorithms may bar, limit, or enable end-user options to control technology, such as those that can be incorporated into a robotic mobility device. These mobility devices—the next generation electric wheelchair—are introduced in the first case study.
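As a purely illustrative sketch, the idea that an algorithm may bar, limit, or enable an end user's control options can be expressed as a simple decision procedure. The mode names and permissions below are hypothetical assumptions, not the specification of any actual device.

```python
# Hypothetical sketch: an algorithm as a step-by-step decision procedure
# that bars, limits, or enables an end user's control options.
# Mode names and permissions are illustrative assumptions only.

def allowed_commands(mode):
    """Return the set of commands the end user may issue in a given mode."""
    permissions = {
        "autonomous": {"specify_task"},                    # user states the goal only
        "local": {"drive", "manipulate", "specify_task"},  # user has full control
        "remote": set(),                                   # a remote caregiver is in control
    }
    return permissions.get(mode, set())

# The algorithm, not the user, decides what the user may do in each mode.
print(sorted(allowed_commands("local")))
```

Switching the mode string changes what the user is permitted to do, which is exactly the kind of designed-in gatekeeping that makes algorithm design an ethical as well as a technical choice.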

4.1 Robots

Robots can be viewed as mere machines, as having a moral dimension, or even as evolving into a new species (Veruggio and Operto 2008). They may stand alone or be embedded in other technology. Exoskeletons, for example, use robotic systems, advanced battery technology and advanced materials. People who otherwise could not walk slip into them for mobility support (Chen 2012).

Movies such as Blade Runner have acquainted us with bad robots that illustrate the conflict between robotkind and humankind. Perhaps less ethically challenging are good robots, such as a little robot named Bandit whose purpose is to help children with autism better understand social cues and emotional behavior (Conley 2011). Movies such as Robot & Frank introduce us to the complexities of distinguishing a good robot from a bad robot. The robot’s cognitive and communication capabilities are more in the realm of science fiction; nonetheless, the film provides a robotic rendition of a future personal assistant.

In Robot & Frank, Frank is an aging former petty thief with progressive memory loss who lives alone but has a family who cares about him. Against Frank’s wishes, but preferable to the other option of living in a facility, Frank’s family responds to concern about his welfare by providing him with a robot health aide. The robot is referred to simply as Robot. Neither a human being, who by birth receives a name and a social security number, nor an anonymous machine, which often remains nameless, Robot reflects the moral ambiguity of valuing an advanced technology characterized by both human and non-human features. At first, Frank regards the health aide as a stupid appliance. However, Robot is very personable and assumes a role akin to a buddy, though not always an empowering one. Robot does not provide Frank with a “turn-off” switch. No information is provided about who controls Robot or has access to its memory. Frank accepts the robot even though he has not been protected by informed consent, which is routinely required for research subjects and patients in medical settings. Informed consent is closely aligned with medical services; however, the services provided by Robot are categorized as non-medical.

Robot supplements Frank’s memory and “lends a hand” around the house. He helps Frank with personal and housekeeping activities such as eating, cleaning and routinizing schedules, and with healthy behaviors such as rising in the morning, exercising and eating vegetables. While Robot is diligently and sociably performing his tasks, Frank discovers that the robot does not have software that would provide ethical guidance such as “thou shalt not steal.” Frank regains a sense of his youthful self when he realizes that, with the aid of Robot, he once again can engage in petty theft and robberies. However, after executing these robberies, and because of his past criminal record, Frank comes under suspicion by a tech-savvy victim who realizes that the evidence to prove Frank’s guilt lies in the memory of Robot. Will Robot provide Frank with the secret to wiping out its memory; will Frank push the erase button? Well, see the movie! However, even without viewing the end of the film, we are made aware of the ethical problems involved in establishing boundaries between human beings and machines, sustaining individual identity in the face of the threat of colonizing the human mind, and invasion of privacy. Thus, independent living for Frank is, at best, a matter of shared decision making with those who may supersede his choices or fabricate his choice options. At what stage should Robot’s own narrative be given more than technical and economic value? Some of these problems involve self-determination and ethical principles such as autonomy and non-maleficence, which instruct us to respect personhood and do no harm. Others await the development of a new dimension in ethics.

4.2 Sensors Used to Monitor and Track People with Disabilities Across the Age Span

The second case study catapults us into applications of sensor technology in the life of an older adult. Sensors are devices capable of detecting and responding to physical stimuli such as movement, light, or heat. Sensors can be located within the body, on the body and in the environment. The U.S. Food and Drug Administration, for example, recently cleared a tiny ingestible sensor used with a companion wearable patch and mobile app to improve medication adherence (Pogorelc 2012). Sensors collect and transmit information, such as blood pressure readings or alerts about falls or trips to the bathroom, often wirelessly to external devices such as smartphones which, in turn, may transmit this information to clinicians, family members and caregivers.

Perhaps the most familiar use of sensors is for monitoring and tracking older adults. Modern home automation and communications systems often provide a visual interface that makes it easy to stay connected with aging or disabled loved ones from anywhere in the world. Sensors can be installed on entryways, chairs, medicine cabinets and cupboards so that a family member or caregiver can be notified automatically of any unusual activity or patterns which should cause concern. They can make life safer by alerting occupants to ringing phones and someone at the door. If they are coupled with the use of locational technology, such as a geographic information system (GIS), then information about location also becomes available. Sensors may also be coupled with video technology. Again there are ethical dilemmas. Just how much autonomy and privacy are you willing to trade off for independent living?
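To make concrete how a system might decide that activity is "unusual" enough to notify a caregiver, here is a minimal sketch: it flags sensors that have not fired within an expected interval. The sensor names, intervals and times are hypothetical assumptions, not a description of any actual product.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: flag "unusual activity" when a routinely triggered
# sensor (e.g., on a medicine cabinet) has not fired within its expected
# interval. All names, intervals and times are illustrative assumptions.

def overdue_sensors(last_fired, expected_interval, now):
    """Return names of sensors whose last event is older than expected."""
    return sorted(
        name for name, fired_at in last_fired.items()
        if now - fired_at > expected_interval[name]
    )

now = datetime(2013, 5, 1, 18, 0)
last_fired = {
    "medicine_cabinet": datetime(2013, 5, 1, 7, 30),
    "front_door": datetime(2013, 5, 1, 17, 45),
}
expected_interval = {
    "medicine_cabinet": timedelta(hours=8),   # medication is due twice a day
    "front_door": timedelta(hours=24),
}
# The medicine cabinet has not been opened for over ten hours, so it is flagged.
print(overdue_sensors(last_fired, expected_interval, now))
```

Even this toy rule shows where the ethical questions enter: someone must decide what counts as "unusual," and someone else receives the notification.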

4.3 Human Enhancements

The scope of human enhancements is expansive, incorporating many individual advanced technologies and combinations of them. Robots, for example, can be categorized as memory enhancements. Enhancements are therefore difficult to categorize and encapsulate in a brief description. The impact of enhancements on human beings can be profound, encompassing a broad range of changes in human nature and function. They are used to temporarily or permanently overcome the current limitations of the human body through natural or artificial means. Enhancements range from the familiar, such as prosthetics and steroids, to the less familiar, such as brain and gene implants (Hamilton et al. 2011; Hanna 2006). Their impacts range from no long-term effects on the body and mind, as in most limb prosthetics, to effects that make someone not just well but better than well, optimizing attributes or capabilities, perhaps by raising an individual from standard to peak levels of performance using genetic enhancements.

Bioethics has been used as a platform for debates about the ethics of advanced technology in therapy versus its use for performance and cosmetic enhancement (Resnik 2000). On the one hand, brain implants, for example, have considerable potential for pinpointing and shutting down seizures caused by epilepsy (Stimson 2011). On the other, an international movement called Transhumanism affirms the possibility and desirability of fundamentally transforming the human condition by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical, and psychological capacities (Bostrom 2005). People with disabilities would no longer exist.

The products of popular culture reflect various ethical concerns about human enhancements. The film Gattaca features the genetically superior “valids” and normal humans, known as “invalids.” The valids get all the high-paying jobs and practically run the country, while the invalids are shown as janitors and other menial workers. In other words, the direction of discrimination runs against normal humans: a two-tiered system positions normal humans as outcasts and valids as dominant. We shall return to this concern for fairness and justice in the case study of the South African runner Oscar Pistorius.

5 The Case Studies

The case studies that follow are composites of real experiences of people with disabilities. Acceptance or rejection of the technology is influenced by their own medical and functional status as well as the concerns of their caregivers and family. The case studies introduce technologies that are more assistive in nature, so that their impacts do not involve changes in what is generally regarded as human nature. Dilemmas arise when trade-offs provide no easy and highly desirable option.

5.1 A Young Man with Spinal Cord Injury; Let’s Call Him Jake Adams

Jake Adams sustained a serious spinal cord injury in an automobile accident and uses a power wheelchair for mobility. He has limited use of his upper extremities and some cognitive involvement. Jake has been involved in research to develop a robotic mobility product (“Quality of Life Technology Engineering Research Center”). The robot will be equipped for mobility and manipulation so that Jake can use the robot’s arms to open doors, including refrigerator and microwave doors, enabling him to eat independently of human assistance. The research is being conducted in a university setting well known for its commitment to participation by people with disabilities and universal design.

The designers have set a goal of producing a device in which there is person-system symbiosis. The mobility device components will be designed to enable independence. Users can operate the device in multiple control modes:

  • Autonomous control mode in which the user can specify an activity of daily living (ADL) task, such as where the mobile base should go and what object the robotic arms should manipulate;

  • Local control mode in which the user can fully access the control of both mobility and manipulation;

  • Remote control mode, in which a remote user, for example the caregiver, is able to remotely complete an ADL task for the local user;

  • Cooperative control mode, in which the work is shared between the user in the wheelchair and the caregiver operating through remote control.

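Purely for illustration, the range of control modes can be sketched as a single blending rule: a weight determines how much of the motion command comes from the user and how much from the remote caregiver. The function name, the joystick-style inputs and the 0.7 default are hypothetical assumptions, not the research center's actual design.

```python
# Hypothetical sketch of the control modes as a blending rule.
# user_share = 1.0 corresponds to local control, 0.0 to remote control,
# and intermediate values to cooperative control. Illustrative only.

def cooperative_command(user_input, caregiver_input, user_share=0.7):
    """Blend two joystick-style (forward, turn) inputs according to
    how much of the work the wheelchair user contributes."""
    if not 0.0 <= user_share <= 1.0:
        raise ValueError("user_share must be between 0 and 1")
    return tuple(
        user_share * u + (1.0 - user_share) * c
        for u, c in zip(user_input, caregiver_input)
    )

# Example: the user pushes straight ahead; the caregiver corrects slightly left.
print(cooperative_command((1.0, 0.0), (0.5, -0.4)))
```

Who is allowed to set the blending weight, and whether the user can see or override it, is precisely the kind of design decision the case study treats as ethically loaded.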
The range of control modes provides Jake and his caregivers with incentives for both independence and cooperation. The remote control mode will require communication and other capabilities to provide a service not unlike the remote telephone relay services which provide deaf, hard of hearing and speech-impaired customers with telephone service. However, these control options and other features generate privacy issues. The robotic mobility device utilizes cameras, sensors, WiFi and other technology which collect physical location data (GPS) as well as visual and auditory recordings. Who has access to the data (caregivers, clinical providers, insurers), and for what purpose? Is the vast information transmission system secure enough that the data are held confidential? Safety issues also emerge in the use of the remote control mode. For example, a remote user could open the front door to allow a robber to enter.

In this case study, the engineers have adhered to the principles of autonomy, beneficence and non-maleficence by designing technology that enhances independence and seems to be good for the individual. The end user can decide who controls what. If Jake receives training in the use of the device, and if he is apprised of the privacy and safety risks involved in the use of the various video, locational and communications systems and agrees to accept them, then the researchers have met their ethical responsibilities. Justice, involving the ethical distribution of both risks and benefits, is not within the authority of the research team; nonetheless, the researchers should regard themselves as advocates for fair reimbursement policies.

5.2 An Older Woman Who Is at Risk for Falling and Is Sometimes Forgetful; Let’s Call her Eve Jones

Mrs. Jones is a fiercely independent 85-year-old who is determined to continue living in the home in which she and her husband, now deceased, raised their children. She has a heart condition and arthritis, for which she takes medications when she remembers. The medications sometimes make her feel dizzy, affecting her balance.

Her children are worried about her. They, and her physician, have tried to convince her to move into a more protected environment, but she has refused to budge even though she has a number of conditions which put her at risk of injury. Mrs. Jones’ older daughter has pleaded with her to accept some surveillance equipment. The daughter’s investigation of costs shows that the family can afford the equipment. Human tracking equipment is now affordable and available without restriction for $200 plus a monthly service fee of $20. The equipment is often marketed as “kid-tracking” devices, though some ads also mention pets and senior citizens (“Human tracking: Big Brother goes mainstream” 2005). This equipment, which does not involve a prescription and which is located in the home, is not regulated so as to require privacy protections akin to informed consent in research and health care.

Mrs. Jones views video cameras as particularly repugnant because they are ubiquitous and highly invasive. She rejected video cameras outright, especially for the bedroom and bathroom. She does not want images of her intimate activities captured and shared. Mrs. Jones claimed that her occasional memory lapses do not justify the use of GPS and ingestible sensors. However, she did acknowledge that she might receive comfort by some sensor monitoring, especially for falls and also alerts to incoming telephone calls and people at the door. Her family, however, wanted her to accept more extensive surveillance.

The question of who has the right to choose is made more complex because Mrs. Jones is at risk for physical injury. However, she is mentally competent. These problems correspond to the principle of respect for persons and autonomy. However, under conditions of considerable risk of injury, negotiations among the parties are necessary and justified. Does justice require that society place restrictions on the adoption and use of the equipment and on the content of advertising?

5.3 Human Enhancements: The Case of the Long Distance Runner

Oscar Pistorius was born in 1986 in South Africa with congenital absence of the fibula, or calf bone, in both legs (“Pistorius, Oscar” n.d.). When he was 11 months old, his legs were amputated halfway between his knees and ankles. Nonetheless, he became a world-class runner, winning gold at the London Paralympics in 2012 when he helped South Africa win the 4x100m relay in a world-record time at the Olympic Stadium. Pistorius uses high energy-storing prosthetic feet known as Cheetahs. Viewing the Cheetahs as providing an advantage over athletes not using such a device, the International Association of Athletics Federations banned him in 2008 from competing in the Olympics. The ruling was overturned, for lack of evidence, after Pistorius challenged the Federation.

Evidence, and some would argue fairness, may support the admission of Oscar Pistorius to Olympic competition. However, the impact of high-tech prosthetic feet presents a vivid contrast to the seemingly dangerous impacts of genetic enhancements on human nature. What conditions and criteria justify the individual acceptance and societal adoption and use of these enhancements? Can these criteria harmonize with the principles of respect for persons, beneficence, non-maleficence and justice?

6 Conclusions

Should robots be personal assistants? Clearly, criteria must be predicated on a response to one or the other of two categories of advanced technologies. Technologies in category 1 provide support but do not make fundamental changes in human nature. These technologies, for the most part, correspond to and are embedded in an ethical framework of disability rights and Common Morality. Technologies in category 2 may change human nature and exist in a context without a widely accepted ethical framework. Decisions about technology in category 2 are market driven. People with disabilities should be involved in questions of who defines the good or sanctions the bad for these technologies. They have a large stake in who loses and who gains.

Technologies in category 1, such as the robotic mobility device, the surveillance equipment and the Cheetah prosthetic feet introduced in the case studies, generate many unresolved ethical problems. While not widely acknowledged in ethics, technology has social dimensions, as in a design process that incorporates user choice into a robotic mobility device, or a device’s impacts on privacy. While individuals who are subjects of research and patients in the health care system have privacy protections such as informed consent, no equivalent protections exist for people who want to live independently but must choose between surveillance equipment in their homes and facility-based living. Rarely does the scientific literature report findings of studies of personal assistance that include human personal assistants as an option equal to that of technology.

Category 2 involves technology for which the ethical contours are only emerging. As with the robot in Robot & Frank and genetic enhancements, these advanced technologies do not fit comfortably with currently accepted values. Self-determination, human rights and the principles of Bioethics were developed to apply to human beings—not to entities which have human, superhuman and non-human attributes. Category 2 technology involves trade-offs between identity and colonization of personhood, and between realization of medical cure and cognitive, physical and sensory functioning and super-functioning. Some adults and parents of persons with disabilities may desire or require increments of support, as in the case of borrowed memory, which may cure symptoms but colonize minds. This is not self-determination and independent living as we understand it today. Perhaps, through the lens of a person with dementia or the parent of a child with cognitive disabilities, the option should be made available. If so, the framework for dispensing this advanced technology must involve strict criteria for use within a context resembling that of the health professions.

Disability theory has used the social model of disability to explain disability. Ableism theory has provided impetus for the important work of dismantling discrimination. Perhaps theory should be developed based on disability narrative, which could inform experience with some of these advanced technologies. Under what conditions do harsh terms such as colonization apply as descriptors of the impact of these technologies on the lives of people with disabilities? New theory must emerge to explain why and under what conditions we should or should not accept, adopt and use category 2 advanced technology. Therefore, the question, “Should robots be personal assistants?”, remains in the active file for follow-up!