
1 Software Process Improvement

Effective development and implementation of systems is the result of proper interplay between (1) human-centred aspects, specifically socio-cultural aspects, and (2) technology. Thus the study and optimisation of these should be conducted concurrently. There is a realisation in industry, commerce and government that the application of new software methodologies and technologies has failed to realise the desired gains in productivity and quality. There is recognition that a significant problem is the inability to manage the software process. There has been a significant number of software process improvement efforts. Notable software process improvement standards and models include:

  • The Capability Maturity Model Integration [CMMI], the successor to the older CMM, developed at Carnegie Mellon University (CMMI 2010);

  • The International Organisation for Standardisation’s 9001 specification [ISO 9001] (ISO 2016);

  • The ISO/IEC 15504 IT Process Assessment, also known as Software Process Improvement and Capability Determination [SPICE] (ISO 2012);

  • Six Sigma, the data-driven leadership approach (Pyzdek 2003);

  • The IEEE 730-2014 Standard for Software Quality Assurance Processes, which establishes the requirements for initiating, planning, controlling and executing the software quality assurance processes of a software development or maintenance project (IEEE 2014).

Central to each of these improvement models is the notion of a focused and sustained effort towards building a process infrastructure of effective software engineering and management practices. The SPI strategy aims for something that is more focused, more repeatable and more reliable with regard to the quality of the system developed (conformance to requirements, reliability, usability, etc.), the timeliness of delivery and the expected cost. Quality in use also has implications for performance, reliability and usability. Quality can also be understood in the context of SQuaRE (Software product Quality Requirements and Evaluation), a more extensive series of standards intended to replace ISO/IEC 9126 (ISO/IEC 2011). ISO 25010 has a greater number of product quality characteristics and sub-characteristics than ISO 9126, including such aspects as effectiveness, efficiency, user satisfaction and societal impacts. The overall assumption is that a sound and improving process is likely to result in high-quality systems, i.e. process improvement is likely to result in improved products. However, if the organisational culture and practices allow for unethical behaviour, SPI efforts are likely to fail.

1.1 Professionalism and Health & Safety

When computer professionals begin work, they typically enter into relationships with one or several of the following: employers, clients, co-professionals (or the profession as a whole) and the public (Johnson 1995). Literature reviews suggest that, with regard to IS development and implementation projects, the relationship between the professional and the client is more often than not the predominant one, and it frequently raises ethical and professional issues that can determine the success or failure of such projects. Clients are heavily dependent on software and hardware suppliers for accurate, honest and open information, alongside sound and objective advice. This dependence creates special obligations for the vendor to be conscientious about advising clients.

The health and safety, and the wellbeing, of the end users of the systems we develop as engineers present a problematic challenge for countless developers and organisations. Health and safety management is the process of identifying and minimising threats to workers, and to those affected by the work, throughout the project, programme and portfolio life cycle. The health effects associated with the use of computer technology have important implications because of the prevalence of work with IT equipment in its various forms. Special implications arise for developers of systems and for users of the deployed solutions, and these are discussed in this paper. There is also UK legislation governing health and safety in the workplace.

1.2 Computer Ethics

It is generally recognised that law and morality have certain key principles and obligations in common. In many cases, therefore, the law will clearly apply and lead directly to the appropriate ethical conclusion. However, to rely solely on law as a moral guideline is clearly dangerous because in certain circumstances bad laws exist (Kallman and Grillo 1996) (Spinello 1995). Inadequate laws may impose rules on society that fail to provide moral guidance. Such laws may, in some instances, excuse a society from fulfilling certain obligations and duties, or allow a society to justify its unethical behaviour. Ethical judgments simply do not have the same deductive rigour and objectivity as scientific ones. However, moral judgments should be based upon rational moral principles and sound, carefully reasoned arguments. Spinello (1995) states that normative claims are supported by: “An appeal to defensible moral principles, which become manifest through rational discourse”.

A normative claim can only be substantiated, and a rational discourse presented, through an appeal to such principles. Thus, with regard to the ethical issues raised by systems development and deployment, in Sect. 2 of this paper we present a list of defensible ethical principles, which are taken from ethical theory. In Sect. 3 the authors identify the current issues concerning health and safety in the systems development and deployment process. Computerised information systems have brought with them new health and safety hazards, and these are identified alongside the issues pertaining to discrimination in the workplace. A number of heuristics are suggested in Sect. 4 which, if followed, may lead to ethical guidance concerning health and safety in the systems development and deployment lifecycle. These normative claims are substantiated by citing one or more of the ethical principles from Sect. 2. Thus each heuristic is based upon rational moral and philosophical principles and sound, carefully reasoned arguments.

1.3 SPI Manifesto

The Software Process Improvement (SPI) Manifesto consists of three values and ten principles, which serve as an expression of state-of-the-art knowledge on SPI. In planning an SPI project, these values and principles can be embraced in order to better facilitate the necessary corresponding change in the organisation (EuroSPI.net 2010).

The argument put forward in this paper, that we as SPI professionals need to fulfil ethical duties concerning the health and safety and the wellbeing of end users of the developed and deployed systems, correlates with the values outlined in the SPI Manifesto. The SPI values state that SPI must involve people actively and affect their daily activities; that SPI is what you do to make business successful; and that SPI is inherently linked with change. The corresponding principles, fleshed out from these three values, serve as foundations for action. The notions of health and safety and wellbeing are implicit in the SPI Manifesto values and principles.

2 Defensible Ethical Principles

There is a range of ethical theories that have been developed throughout history, and one or a combination of these can be selected. Fundamentally there are two basic approaches to ethics: Teleological theories (which consider the consequences of an action as the measure of its goodness) and Deontological theories (which emphasise the rightness of an action above the goodness it produces).

Kallman and Grillo (1996) present a framework for ethical analysis. Amongst a multitude of other details, it lists some basic moral principles and theories that can serve as normative guidelines for addressing moral issues, i.e. cases where ethical and professional issues may have been invoked. The framework also advocates the steps that are required in order to conduct an ethical analysis. The following sub-sections enumerate these principles, which have been sourced from ethical theories, including Teleological and Deontological ones.

2.1 Deontology

Kallman and Grillo (1996) state that deontological ethics, often referred to as deontology, is the normative ethical position that appraises the morality of an action based on rules. Deontology is at times described as duty-, obligation- or rule-based ethics, because rules bind an individual to their duty.

According to Ross (1930), Duty-Based Ethics (Pluralism) can be viewed as seven basic moral duties, which are:

  1. One ought to keep promises (fidelity)

  2. One ought to right the wrongs that one has inflicted on others (reparation)

  3. One ought to distribute goods justly (justice)

  4. One ought to improve the lot of others with respect to virtue, intelligence, and happiness (beneficence)

  5. One ought to improve oneself with respect to virtue and intelligence (self-improvement)

  6. One ought to exhibit gratitude when appropriate (gratitude)

  7. One ought to avoid injury to others (non-injury).

According to Kallman and Grillo (1996), in Rights-Based Ethics (Contractarianism) there are three fundamental rights. Hamelink (2000) identified and appended a further seven, to give the following list of rights:

  • The right to know

  • The right to privacy

  • The right to property

  • The right to security

  • The right to political participation

  • The right to freedom of expression

  • The right to freedom of association

  • The right not to be discriminated against

  • The right to fair access to, and development of, communication resources

  • The right to protection of cultural identity

2.2 Teleology

In contrast to deontology, teleology describes an ethical perspective that asserts that the rightness or wrongness of human actions is based exclusively on the goodness or badness of their consequences. Teleology therefore views actions as being morally neutral when considered apart from their consequences. Kallman and Grillo (1996) identify three philosophies under the umbrella of teleology:

  • Ethical Egoism: Moral agents ought to do what is in their own self-interest

  • Utilitarianism: Operating in the public interest rather than for personal benefit; maximising benefits over costs for all involved, with everyone counting equally

  • Altruism: Acting for the benefit of others, even at a cost to oneself.

2.3 Further Normative Principles

  • Principle of Autonomy: According to Immanuel Kant’s moral philosophy, for an individual to be truly human, that person must be free to decide what is in his or her best interest.

  • Principle of Informed Consent: The Kantian approach affirms that consent is agreement given freely to something (Kant 1785). For such assent to have significance, it should be informed, that is, based on accurate information and an understanding of the issues at hand. If this information is deliberately withheld, or is incomplete because of carelessness, then the consent is given under false pretences and is invalid.

  • Golden Rule, “What you do not want others to do to you, do not do to them.”

  • The US Content Subcommittee of the ImpactCS Steering Committee (Huff 1995) identifies six areas of ethical concern: Quality of life; Use of power; Risks and reliability; Property rights; Privacy; and Equity and access.

The relevant normative principles presented above will be applied to the moral dilemmas invoked by systems development and deployment by business process engineers, software engineering teams, process improvement managers, etc.

3 Health and Safety Considerations at Work

The principal UK legislation governing health and safety in the workplace, including the use of computers, is the Management of Health and Safety at Work Regulations 1999. The Regulations were introduced to reinforce the Health and Safety at Work Act 1974. The MHSWR places duties on employers and employees, including those who are clients, designers, principal contractors or other contractors (Gov.UK 2018a). Care must be taken that employees are not exposed to radiation from monitors, and adequate and appropriately designed equipment and furniture must be provided to minimise the risk of injury. Guidelines exist for creating and maintaining adequate working environments, including lighting and ventilation in offices, and for the conditions under which computers are used, including the appropriate frequency and length of breaks for those working at computer terminals for long periods. To combat the problem of occupational injuries and diseases, ISO developed the ISO 45001 standard: Occupational Health and Safety (ISO 2018). It stipulates requirements that will help organisations reduce this burden by providing a framework to improve employee safety, reduce workplace risks and create better, safer working conditions, including in the software and systems engineering industries.

Under the Equality Act 2010 (Gov.UK 2018b), disability is defined concisely and succinctly as “a physical or mental impairment that has a ‘substantial’ and ‘long-term’ negative effect on your ability to do normal daily activities”. ‘Substantial’ means more than minor or trivial, for example when it takes much longer than it usually would to complete a daily task like getting dressed; and ‘long-term’ means twelve months or more, for example a breathing condition that develops as a result of a lung infection. There is a range of disabilities, including visual, auditory, physical, speech, cognitive, language, learning and neurological disabilities.

3.1 Computers and the Workplace

Sauter and Murphy (1995) investigated the changing structure of work in our society and presented empirical research studies which pointed to computerised information systems having brought with them new health and safety hazards. The authors argued that the computerisation of office work has resulted in increased levels of stress for workers. It was concluded that stress in the modern office, in particular where computerised monitoring and surveillance systems were implemented and utilised, could lead to, amongst other things: loss of job satisfaction; low morale; absenteeism; and poor employee-management relations.

Duquenoy et al. (2008) postulate that increased interaction with computers, instead of people, has led to a reduced sense of personal responsibility in the modern office; a resulting sense of anonymity and depersonalisation that can cause a lack of respect for an organisation and its resources; and a diminished sense of ethics and values on the part of its employees.

Suparna and Bellis (2001) argue that prolonged use of video display units (VDUs) can have a detrimental impact on users’ health. Sustained use of computer monitors can result in a number of conditions, including eyestrain, double vision and headaches, and neck and shoulder problems. Bowey (2006) concludes, on the basis of the findings of a multitude of research studies, that excessive use of a computer keyboard and other input devices, such as a mouse, can also lead to injuries to the arms, hands and fingers. This type of physical stress is commonly known as repetitive strain injury (RSI).

3.2 Discrimination at Work

Providing equal access to information systems for disabled groups is an important element of the implementation and management of IT systems in the workplace. These rights are enshrined in law, primarily via the Human Rights Act 1998 and the Equality Act 2010. The former states that individuals should not be discriminated against on “any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status” (Gov.UK 2018c). The Equality Act places upon an employer the duty to make reasonable changes for disabled employees, known as ‘reasonable adjustments’. Adjustments should be made to avoid an individual being put at a disadvantage compared to non-disabled people. The Equality Act 2010 also provides legal rights for disabled people regarding access to goods, services and facilities.

Cultural and language barriers may result in unconscious bias, misunderstandings, conflicts and project failures. Treating people unfairly at work because of differences in culture, race or sexual orientation may even be unlawful under equal opportunity laws. In today’s globalised world, organisations may span a variety of countries with different working cultures and different discrimination laws. Thus they must comply with national, European and international labour standards laid down by bodies such as the International Labour Organisation (ILO) (Rice-Birchall 2015).

Similarly, gender discrimination continues to be an issue encountered by women in the workplace, including sexual harassment and gender evaluation (the use of gender as a criterion for job-related decisions), which inevitably has a negative impact on job-related outcomes (Shaffera et al. 2000). Some of the forms of discrimination discussed previously could be the result of unconscious bias. The BCS, The Chartered Institute for IT, has introduced an Unconscious Bias Training course for all employees and volunteer committee members, which includes a series of case studies to generate self-awareness (BCS UcB 2015). Case-study research carried out by Georgiadou et al. (2009) revealed that women either do not participate in IT professions or, when they do participate, have minimal opportunities for career advancement. Such injustices are highly likely to result in process and systems failures that are detrimental to the individuals, the project, the organisation and society at large.

4 Heuristics

A number of heuristics are suggested below which, if followed, may lead to ethical systems development and deployment guidance in the context of health and safety. Each rule of thumb is substantiated by citing one or more of the ethical normative principles listed in Sect. 2 above. Clients often lack relevant knowledge or experience regarding health and safety and computers in the workplace; it is the computer professional’s duty to instruct in such circumstances.

Incorporate in the Design the Utilisation of Assistive Technologies:

Part of the system engineer’s brief is to oversee the development and installation of new hardware and software. This must be completed within the framework of the public interest. The British Computer Society Code of Conduct (BCS CoC 2018) demands that computer professionals “have due regard for public health, privacy, security and wellbeing of others and the environment”; “have due regard for the legitimate rights of Third Parties” (which includes any person or organisation that might be affected by your activities in your professional capacity, irrespective of whether they are directly aware of or involved in those activities); and “promote equal access to the benefits of IT and seek to promote the inclusion of all sectors in society wherever opportunities arise”. This can be achieved through the incorporation of assistive technologies, for example screen readers, refreshable braille displays, eye gaze and head mouse systems. In addition, any web content developed must comply with the Web Content Accessibility Guidelines (WCAG) 2.1 (WC3 2017), which define how to make Web content more accessible to people with disabilities. A minimal illustrative accessibility check is sketched after the list of principles below.

  • Deontology (Pluralism): Justice

  • Deontology (Pluralism): Beneficence

  • Deontology (Pluralism): Non-injury

  • Deontology (Contractarianism): The right not to be discriminated against

  • Deontology (Contractarianism): The right to fair access to, and development of, communication resources

  • Teleology: Utilitarianism

  • Teleology: Altruism

  • The US Content Subcommittee of the ImpactCS Steering Committee: Quality of life

  • The US Content Subcommittee of the ImpactCS Steering Committee: Equity and access.
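To illustrate how such accessibility obligations might be checked in practice, the following minimal Python sketch flags images that lack alternative text, one of the simplest checks associated with WCAG 2.1’s requirement for text alternatives to non-text content. The class, the sample markup and the use of Python’s standard html.parser module are illustrative assumptions rather than part of any cited standard or tool; a full audit would use a dedicated accessibility testing suite.

```python
# Minimal illustrative sketch: flag <img> elements with no alt attribute,
# one of the simplest checks associated with WCAG 2.1 (text alternatives
# for non-text content). Not a substitute for a full accessibility audit.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collects <img> tags that have no alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing_alt.append(attributes.get("src", "<unknown source>"))


if __name__ == "__main__":
    # Hypothetical sample markup used purely for illustration.
    sample = '<p><img src="chart.png"><img src="logo.png" alt="Company logo"></p>'
    checker = MissingAltChecker()
    checker.feed(sample)
    for src in checker.missing_alt:
        print(f"Image without alt text: {src}")
```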

Ergonomic Design:

The development and installation of new systems must adhere to ergonomic design, which seeks to reduce strain, fatigue and injuries by improving product design and workspace arrangement. The Health and Safety Executive (HSE 2013) advocated measures that an employer must take in order to protect their employees from any risks associated with Display Screen Equipment (DSE). These recommendations ensured compliance with the Health and Safety (Display Screen Equipment) Regulations 1992. Guidance ranges from how to effectively arrange a workstation, through to users modifying their body mechanics and employees adjusting their work patterns. There is a moral duty on systems developers to have due regard for public health, and for the wellbeing of others and the environment in which their developed solutions are installed. Thus there exists an imperative on computer professionals to install solutions that comply with health and safety guidelines, and to instruct where there is a lack of relevant knowledge or experience of ergonomics (product design and workspace arrangement) in others, for example the clients for whom systems are being delivered.

  • Deontology (Pluralism): Beneficence

  • Deontology (Pluralism): Non-injury

  • Teleology: Utilitarianism

  • The US Content Subcommittee of the ImpactCS Steering Committee: Quality of life

  • The US Content Subcommittee of the ImpactCS Steering Committee: Risks and Reliability

Conduct an Operational Feasibility Study:

An operational feasibility study is the process of determining how a system will be accepted by people (assessing employee resistance to change, gaining managerial support for the system, providing sufficient motivation and training, and rationalising any conflicts with organisational norms and policies) and how well it will meet various system performance expectations (for example, response time for frequent online transactions, the number of concurrent users it must support, reliability, and ease of use) (Stair and Reynolds 2017). There is an ethical duty for health and safety to be assessed as an integral part of an operational feasibility study. In the first instance the study should determine how the system will be accepted by people with specific disabilities. This may imply dialogue between developers and trade union, health and safety, and disability representatives. These representatives have rights under the management regulations to be consulted by their employers and developers about anything affecting members’ health and safety, including the introduction and adoption of new technology. This may result in the negotiation of a policy for working with computers, akin to HSE guides. Secondly, there is a need to interweave health and safety into the system performance expectations. For example, for every system an explicitly stated, and thus contractually binding, non-functional requirement should be compliance with the Health and Safety (Display Screen Equipment) Regulations 1992. A sketch of how such requirements might be recorded is given after the list of principles below.

  • Deontology (Pluralism): Beneficence

  • Deontology (Pluralism): Non-injury

  • Deontology (Contractarianism): The right to fair access to, and development of, communication resources

  • Teleology: Utilitarianism

  • The US Content Subcommittee of the ImpactCS Steering Committee: Quality of life

  • The US Content Subcommittee of the ImpactCS Steering Committee: Equity and access
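As a hedged illustration of the suggestion above, the following Python sketch shows one way health and safety obligations could be recorded as explicit, testable non-functional requirements alongside the operational feasibility study. The dataclass, field names and requirement wording are assumptions made for illustration, not a prescribed format.

```python
# Minimal sketch (illustrative only): recording health and safety obligations
# as explicit, testable non-functional requirements. Identifiers, fields and
# wording are assumptions, not a prescribed or standardised format.
from dataclasses import dataclass


@dataclass
class NonFunctionalRequirement:
    identifier: str
    statement: str
    acceptance_criterion: str
    contractually_binding: bool


HS_REQUIREMENTS = [
    NonFunctionalRequirement(
        identifier="NFR-HS-01",
        statement="Workstations delivered with the system comply with the Health "
                  "and Safety (Display Screen Equipment) Regulations 1992.",
        acceptance_criterion="DSE workstation assessment completed and signed off "
                             "for every user role before go-live.",
        contractually_binding=True,
    ),
    NonFunctionalRequirement(
        identifier="NFR-HS-02",
        statement="The user interface is operable with the assistive technologies "
                  "identified during the feasibility study.",
        acceptance_criterion="Acceptance testing includes users of screen readers "
                             "and alternative input devices.",
        contractually_binding=True,
    ),
]

for req in HS_REQUIREMENTS:
    print(f"{req.identifier}: {req.statement}")
```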

Formulation of a Computer-Use Policy:

Organisations are continually confronted with escalating liability with regard to employee use of electronic resources. In order to mitigate this risk of liability, companies need to develop and implement a computer-use policy, which explicitly outlines proper use of the organisation’s electronic resources (Cox et al. 2005). Employers that use monitoring technology face the possibility of creating an atmosphere of distrust in the workplace. An employee who feels no sense of trust from the employer lacks the incentive to be efficient and could be less productive. A balance needs to be struck between privacy needs and unrestricted control of computer usage, in other words a point on the spectrum between the two extremes of doing nothing and monitoring everything. A computer-use policy must be formulated that explicitly states the agreed behaviour regarding computer usage. The formulation process must commence with a consultation with legal counsel and other relevant parties (for example, human resources, employees and, if applicable, union representatives) to determine what type and scope of policy would best suit the organisation. The system engineers overseeing the installation of new hardware and software (computer resources) have an ethical duty to participate in and contribute to the formulation of this computer-use policy. Their technical expertise and understanding of the delivered system’s current functionality will give invaluable insight into its capabilities, enabling a far more effective policy to be drafted and enforced.

  • Deontology (Pluralism): Beneficence

  • Deontology (Pluralism): Non-injury

  • Deontology (Contractarianism): The right to Privacy

  • Deontology (Contractarianism): The right to Property

  • Principle of Informed Consent

  • The US Content Subcommittee of the ImpactCS Steering Committee: Quality of life

  • The US Content Subcommittee of the ImpactCS Steering Committee: Risks and Reliability

Conduct Risk Management:

In order that risks are managed effectively and efficiently, the hazards and effects associated with the implementation of computer systems have to be properly managed. At its core, risk management can be viewed as four stages: (1) Identify: are people, the environment or assets exposed to potential harm? (2) Assess: what are the causes and resulting concerns? What is the probability of a loss of control? What is the risk? (3) Control: can the cause be eliminated? What controls are needed and how effective are they? (4) Recover: can the potential consequences or effects be mitigated? What recovery measures are needed? Are recovery capabilities suitable and sufficient? In other words, the hazards and effects should be identified and fully assessed, the necessary controls provided, and recovery preparation measures put in place to control any hazard release (PMI 2013). Thus a risk management plan needs to be prepared, typically as a joint effort between the project manager and system engineers, in order to document foreseen risks, estimate impacts and define responses to issues. In order that lessons are learned and process improvement is achieved, a systematic recording and analysis of issues, errors and failures must be carried out. The health and safety concerns identified above should form part of this all-important process and its documentation. A minimal sketch of a risk register structured around these four stages is given after the list of principles below.

  • Deontology (Pluralism): Beneficence

  • Deontology (Pluralism): Non-injury

  • Teleology: Utilitarianism

  • The US Content Subcommittee of the ImpactCS Steering Committee: Quality of life

  • The US Content Subcommittee of the ImpactCS Steering Committee: Risks and Reliability
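The following minimal Python sketch shows how a risk register entry might capture the four stages named above (identify, assess, control, recover). The field names and the simple probability-times-impact scoring are illustrative assumptions only; real projects would follow their organisation’s risk management standard (for example, as described in PMI 2013).

```python
# Minimal sketch of a risk register entry structured around the four stages
# discussed above: identify, assess, control, recover. Field names and the
# crude probability x impact scoring are illustrative assumptions only.
from dataclasses import dataclass, field


@dataclass
class RiskEntry:
    # Identify: who or what is exposed to potential harm
    hazard: str
    exposed: str
    # Assess: causes, probability and impact on a simple 1-5 scale
    causes: str
    probability: int
    impact: int
    # Control: measures that eliminate the cause or reduce the risk
    controls: list = field(default_factory=list)
    # Recover: measures that mitigate consequences if the hazard is realised
    recovery_measures: list = field(default_factory=list)

    @property
    def exposure(self) -> int:
        """Crude risk exposure used to rank entries in the register."""
        return self.probability * self.impact


register = [
    RiskEntry(
        hazard="Repetitive strain injury from prolonged keyboard and mouse use",
        exposed="Data-entry staff using the new system",
        causes="Poor workstation layout; insufficient breaks",
        probability=3,
        impact=4,
        controls=["DSE workstation assessments", "Scheduled screen breaks"],
        recovery_measures=["Occupational health referral", "Adjusted duties"],
    ),
]

# Rank entries by exposure so the highest risks are reviewed first.
for entry in sorted(register, key=lambda e: e.exposure, reverse=True):
    print(f"{entry.hazard} (exposure={entry.exposure})")
```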

5 Conclusions

The rationale of applying the ethical framework presented in this paper was to identify and defend ethical stances that can be taken in the concerns over health and safety regarding newly deployed and existing systems. In doing so, the authors conclude that the importance of ethical considerations in developing and delivering health and safety compliant systems can be brought to the attention of the systems development and software engineering community: providers, project managers, developers, engineers and clients, thus helping to raise the visibility of ethical considerations.

The paper contributes to the current ethical and philosophical discourse relating to health and safety and wellbeing in the use of computers in organisations. In particular, a set of heuristics for ethical health and safety guidance has been proposed, which will raise awareness of the moral issues and help guide developers and users of computer systems. The set of heuristics presented in this paper is an important contribution. For the majority of these suggested rules, UK law clearly applies and leads directly to the appropriate ethical conclusion. But to rely solely on law as a moral guideline is clearly dangerous. There are instances where the relationship between law and ethics breaks down, and the law fails to provide moral guidance. Thus to rely solely on the law for guidance, and to fulfil legal duties exclusively, may lead to occasions where an individual fails to fulfil their ethical responsibilities.

Additional research could include interweaving the issues of health and safety into the systems development life cycle (SDLC). Thus at each stage of the process for planning, creating, testing and deploying an information system, systems developers will be conscious of the duty they have to incorporate health and safety into the system’s specification and design. Further research in this field is needed relating to SMEs and micro companies, where few if any computer professionals are employed, despite IT being a key component of the survival of the business.

The notions of health and safety, wellbeing and ethical duty need to be explicitly addressed in the SPI Manifesto. Although these are implicit in the manifesto’s three values and ten respective principles, there needs to be a much more unequivocal statement of how these notions must govern personal behaviour in relation to Software Process Improvement work. Thus an eleventh principle could be appended to the SPI Manifesto: to fulfil ethical duties.

The focus of this paper has been on the delivery of new systems that must be health and safety compliant for the recipient, the client. However, it should also be noted that system developers are, in turn, themselves employees. Thus, in their everyday working lives, they too should be entitled to work in environments that are conducive to good health and wellbeing.