Thinking systemically is inherently collaborative. (Williams & Hummelbrunner, 2010, p. vii)

Systems Research: What and Why

The concept of systems research could be seen to have at least two distinct though related meanings: doing research from a systems perspective or doing research into the nature of systems. In offering a text on systems research, the authors seek, first, to offer a framework and approach that will be relevant from either standpoint, and perhaps also facilitate greater integration between the two. Second, we have organized this book to provide a comprehensive overview of theory, practice, and methodology relevant to such an approach.

Defining Research

At its most basic, all research might be seen as gathering information to inform action. Ultimately, it is part of a circular process of ongoing learning, based on previously obtained knowledge and experience. The scientific method involves, first, the recognition of a particular area of interest (a problem, situation, event, physical phenomenon, and so forth) that requires explanation or better understanding. The next steps involve elaborating a plan for gathering information about the chosen focus, generating a hypothesis to provide the rationale and process for putting the plan into action, and gathering data, which must then be analyzed to elicit at least a tentative explanation of the phenomenon under investigation. This result will then inform future research, as well as any actions taken on the basis of what was learned.

This cycle of observation, reflection, planning, and acting (see Fig. 1.1) is at the heart of the framework that will be presented in this book as an organizing metatheory for understanding the nature of systems, as well as the significance and potential of a systemic approach to research (essentially ontological and epistemological considerations respectively).

Fig. 1.1 Basic systems research framework
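As a rough illustration of the cycle, the following Python sketch (hypothetical names and structure, not part of the framework itself) treats the four phases as steps in an iterative loop, where the outcome of each pass becomes part of the knowledge informing the next.

from dataclasses import dataclass, field

@dataclass
class ResearchCycle:
    # Accumulated knowledge carried forward from one pass to the next.
    knowledge: list = field(default_factory=list)

    def observe(self, focus):
        # Gather information about the chosen area of interest.
        return {"focus": focus, "data": f"observations of {focus}"}

    def reflect(self, observation):
        # Interpret the data in light of previously obtained knowledge.
        return {"interpretation": observation["data"], "prior": list(self.knowledge)}

    def plan(self, reflection):
        # Formulate a tentative hypothesis and a plan for acting on it.
        return f"hypothesis informed by {len(reflection['prior'])} earlier cycles"

    def act(self, hypothesis):
        # Acting changes the situation and adds to the stock of experience.
        self.knowledge.append(hypothesis)

cycle = ResearchCycle()
for _ in range(3):  # three passes around the observe-reflect-plan-act loop
    cycle.act(cycle.plan(cycle.reflect(cycle.observe("local watershed"))))
print(cycle.knowledge)

The point of the sketch is simply the circularity: the result of one pass is the starting point of the next.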

The Emergence of Systems Thinking

The multifaceted systems field emerged in the mid-twentieth century out of a growing recognition of the limitations of traditional approaches to scientific research, specifically in terms of the mechanistic and reductionist assumptions at the foundation of modern science since Descartes and Newton. Part of the legacy of this orientation has been the creation of a divide between “natural” science and “social” science. This divide, between what C. P. Snow (1959) identified as the “two cultures,” takes on many forms depending on the context, and might be seen as part of the problem focus for this book as a collaborative systems research project.

The broad-ranging scope of developments in the systems field reflects a variety of impulses and commitments. Among the most significant for our purposes in this collaborative project is the recognition of the fragmented nature of discipline-based research and the need for a more integrated approach. The reductionist orientation of traditional science has been enormously successful in elaborating the mechanisms of natural phenomena, expanding humanity’s collective understanding of the universe within which we find ourselves, as well as our ability to manipulate our environment in ways that most would agree have benefitted the human species enormously, or at least some of its members. This success, however, has often come at some cost to the “whole system”: the environment, other species, and perhaps the long-term viability of human habitation on the planet.

Traditional approaches to research require a narrowing of focus, in the spirit of Descartes, isolating a small part of the problem/situation/phenomenon (i.e., the system, using the term inclusively to encompass any kind of entity that can be studied), in order to understand its behavior under varying conditions. This calls to mind the often-quoted maxim, “all other things being equal.” Classical science has had a tendency to marginalize and trivialize those “other things.” Perhaps most critical among them are subjectivity and agency, which play such an important part on the social science side of the divide.

In seeking to understand the nature of systems, traditional science maintained an attitude of detachment and objectivity, until Heisenberg (1930) demonstrated that the observer cannot be separated from the observed. We as researchers (observers of nature) are embedded in the phenomena we seek to understand. We bring our biases, assumptions, and motivations, and we work within the constraints imposed by the environment within which we conduct our research.

One’s perception of the nature of reality determines the selection of data—what will be included and what will be excluded, the methodology for gathering the data, and the interpretation and meaning that will be drawn from the data. The motivation of the researcher (the purpose of the research) also informs the selection of data and the kinds of learning and/or action that will result.

In natural science, both theoretical and applied, the perception of reality as mechanistic is generally an unquestioned assumption. There is no room in this worldview for agency, purpose, or intelligence (other than human of course, although the question of how that evolved and functions in a mechanistic universe is never sufficiently explained). However, seen from a cyclical rather than a static perspective, information sharing, communication and learning are an integral part of the evolutionary process, which can be observed even at the atomic and molecular level.

Implications for Research

Returning to the initial distinction between two different ways of defining systems research, the motivation for research in the natural sciences is generally oriented around research into the nature of systems. The purpose of such research is to build on the body of knowledge in a particular discipline and, eventually, to apply that knowledge to a particular end. The distinction between theoretical or “pure” science and applied science results from the institutional structure within which these pursuits are, to a large extent, isolated from one another, as well as from the social and environmental contexts within which scientific research is both conducted and applied.

In the social sciences, and the biological sciences as well, research is still concerned with the nature of systems, although the greater role of the environment in social and biological systems requires a somewhat different approach. One might see natural law as a constraining environment for physical systems, but this context remains essentially unchanging. In seeking to understand the behavior of living and other complex systems, the environment emerges as a critical factor, the processes of feedback and learning play a much more pivotal role, and the reductionist paradigm becomes increasingly inadequate. Indeed, it is in connection with his research in the biological sciences that Ludwig von Bertalanffy initially proposed the concept of general systems theory in the early twentieth century. The emergence of the ecosystem concept (Tansley, 1935) and the subsequent growth of ecology as a scientific field of study during this same time period reflect the growing awareness of the importance of considering the “environment” as popularly understood.

Further complications in studying the nature of human systems are the roles that subjectivity and objectivity play in the behavior of human actors in the system. These distinctions are probably the most critical factors in creating the divide between natural and social science. The commitment to objectivity in the former precludes consideration of consciousness, interpretation, meaning, motivation, purpose, and so forth, within the systems being studied. Of course, these dimensions are recognized as embodied within the researchers themselves and are clearly present within the process of conducting the research, yet they are not considered as relevant to the research into the system itself.

It is within the context of the social and biological sciences that doing research from a systemic perspective or orientation becomes more compelling, although this approach is ultimately relevant in the physical sciences as well. This systemic orientation requires a broadening of focus to include whole systems, with the recognition that any research also requires a clearly defined and bounded system. Thus the researcher must seek to be as inclusive as possible in relation to the focus of the research, while acknowledging and providing a clear rationale for the delineation of a particular boundary, and being aware of potential influences from outside the boundaries of the system thus identified. These kinds of considerations inform the emergence of the concept of holons or the holarchic nature of reality, originally introduced by Arthur Koestler (1970) to describe the concept of a multilayered structure of systems within systems. This multilayered structure is also described in terms of hierarchy, although that term is often understood to imply hierarchies of power, which is not necessarily the case in the holarchic sense.

Systems and Circularity

A systemic approach to research requires a much more robust examination of the interrelationships among the various components of the system being studied, as well as between the system and the larger environment. As previously stated, living systems are characterized by feedback and learning (circular or nonlinear processes) and function according to the basic systems research framework outlined above: observe, reflect, plan, and act. Although these terms carry anthropomorphic connotations, they can be reconceived in ways that are relevant to both physical and biological systems, without changing the essential nature of the framework. The cycle, as thus elaborated, implies some level of decision-making at all levels of the system, in response to both internal interactions and external information and constraints. This decision-making process can be unconscious and predetermined (in most physical and biological systems) or subject to conscious evaluation and choice (in most human systems).

The systems research framework that is being introduced in this volume is based on the work of Robert Rosen (1958) on relational theory, which was further developed by one of our co-authors, John J. Kineman (2011, 2012). The emphasis on relation is key. Joanna Macy (1991) provided some useful insights on the nature of this relation in her comprehensive discussion of mutual causality, comparing Buddhism and systems theory in articulating the concept of dependent co-arising. A systemic orientation need not appeal to external intelligence or a supernatural designer to account for purpose or intelligence within the evolutionary cycle.

Instead, a systems orientation to understanding the nature of reality highlights interrelationship, mutual causality, and the potential for the emergence of novelty, which is not necessarily predictable. This perspective places the researcher back in the system as an integral part of the system, not as an objective external observer. Essentially, it reinforces the conception of a participatory universe, articulated by John Wheeler (1994) in connection with his work on quantum mechanics.

The systems view of reality, along with the related notion of a participatory universe, has important philosophical implications. The purpose of this chapter is to articulate ontological, epistemological, and ethical considerations in conducting research—both into the nature of systems and from a systemic orientation. In order to provide some context, it will be helpful to begin with some background on the emergence of systems ideas.

Conceptualizing Systems

The concept of system as an organizing framework for scientific research emerged in the mid-twentieth century, growing out of a number of parallel and related developments in theoretical and applied science. The Newtonian framework, which had guided scientific inquiry for three centuries, was initially challenged by developments in physics, the iconic discipline of classical science. In exposing the limitations of the mechanistic and reductionist orientation inherent in that approach, relativity theory and quantum mechanics transformed humanity’s collective understanding of matter, energy, and time, revealing them to be less rigidly fixed than previously conceived. More importantly, these theories called into question reigning assumptions about predictability, determinism, and scientific objectivity. The observer could no longer be seen as outside and separate from the phenomena being observed.

Developments in the biological sciences—the emerging understanding of feedback processes and the concept of living organisms as open systems—highlighted the need for a new conceptual framework to adequately address the complexity of these systems. Generally recognized as the “father” of general systems theory, Ludwig von Bertalanffy proposed the concept of organismic biology in the early twentieth century as an alternative to the mechanistic paradigm, then dominant in the life sciences. Arguing that the laws of physics and chemistry were insufficient to explain the complex organization in living systems, he believed that the laws of organization were emergent properties that could be studied scientifically. Perhaps his most important contribution to the evolution of systems ideas, the concept of open systems highlighted the capacity for self-organization, creativity, and spontaneity in the behavior and evolution of living systems.

Many of the insights emerging in the biological sciences in the early twentieth century were echoed in the engineering sciences, and, in fact, there was considerable cross-fertilization between these two fields (see Haraway, 1976; Weiss, 1939). In seeking to understand complex patterns of organization, interrelationship, and developmental change, biologists often drew analogies from mechanical systems. As engineering became increasingly sophisticated, the models and metaphors for understanding living systems evolved accordingly, from mechanical levers and pulleys, as in seventeenth century descriptions of circulation in the body, to conceptualizing living organisms as information processing systems in the twentieth century. Notable in this regard is the work of Paul Weiss (1939, 1973), who applied systems concepts from engineering to explain organizational processes in embryology, which shaped the development of von Bertalanffy’s thought (see Haraway, 1976; Hammond, 2003).

A critical dimension in understanding organizational patterns and processes, in both living and sophisticated technological systems, is a recognition of the important role of feedback processes and circular, or nonlinear, causal relations. Drawing on the earlier work of the French physiologist Claude Bernard (1865), Lawrence Henderson (1913) and Walter Cannon (1932) articulated the processes of homeostasis in living organisms, reinforcing a more holistic approach to understanding both biological and social phenomena. A related development that contributed to a growing emphasis on the importance of information and communication in complex systems was Claude Shannon and Warren Weaver’s (1949) elaboration of information theory in The Mathematical Theory of Communication.

Perhaps the most significant example of cross-fertilization between these emerging systems-oriented sciences is the series of 10 conferences on what came to be known as cybernetics, hosted by the Macy Foundation between 1946 and 1953 (see Heims, 1991). The motivation for convening the conferences was the recognition of similar patterns of self-corrective feedback processes in a broad range of disciplines, and they brought together researchers from fields as diverse as mathematics, physics, engineering, computer science, neurophysiology, psychology and psychiatry, anthropology, sociology, and philosophy.

In their seminal paper on “Behavior, Purpose and Teleology,” which provided the initial impetus for the conferences, Arturo Rosenblueth, Norbert Wiener, and Julian Bigelow (1943) suggested that “all purposive behavior [emphasis added] may be considered to require negative feedback” (p. 19), thus providing a lens through which non-mechanistic aspects of system behavior might be incorporated. The processes of feedback came to be seen as the basis for self-regulation and self-organization in complex systems.
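As a minimal illustration of the idea (a generic thermostat-style example, not drawn from the 1943 paper), the sketch below shows how a response proportional to, and opposing, the deviation from a goal state keeps a variable near that goal despite a constant disturbance.

def regulate(temperature, setpoint=37.0, gain=0.3, disturbance=-0.5, steps=20):
    # Negative feedback: the sensed deviation from the goal drives a correction
    # in the opposite direction, so the system settles near the setpoint even
    # though a constant disturbance (e.g., heat loss) keeps pushing it away.
    history = []
    for _ in range(steps):
        error = setpoint - temperature   # sensed deviation from the goal
        correction = gain * error        # response opposing the deviation
        temperature += correction + disturbance
        history.append(round(temperature, 2))
    return history

# Climbs toward the setpoint and levels off below it: a purely proportional
# response cannot fully cancel a constant disturbance.
print(regulate(temperature=33.0))

Purposive behavior, in the cybernetic sense quoted above, requires nothing more elaborate than this kind of error-correcting loop.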

Gregory Bateson (1972), a member of the cybernetics group, described “the subject matter of cybernetics [as] not events and objects, but the information ‘carried’ by events and objects” (pp. 401–402). Perhaps even more presciently, Norbert Wiener, who popularized the term in his 1948 book, Cybernetics: Or Control and Communication in the Animal and the Machine, wrote:

It is the thesis of this book that society can only be understood through a study of the messages and communication facilities which belong to it; and that, in the future development of these messages and communication facilities, messages between man and machines, between machines and man, and between machine and machine are destined to play an ever-increasing part. (p. 16)

Although not a member of the original group, Stafford Beer (1966), an active member of the American Society for Cybernetics, echoed this theme, providing a useful transition from the theoretical to the applied sciences in his application of cybernetics to problems in management: “Cybernetics is the science of effective organization. It studies the flow of information round a system, and the way in which this information is used by the system as a means of controlling itself” (p. 254).

In parallel with theoretical developments in the physical and life sciences, emerging technologies in the energy, transportation, and communication sectors fostered an unprecedented growth of large-scale organizational structures in both public and private sectors (Boulding, 1953). Operating at the interface between human, technological, and ecological systems, these organizations required a far more sophisticated approach to coordinating the various components of their operations.

Understanding the nature and source of organization in complex systems became increasingly critical in the wake of the technological revolutions that so profoundly transformed the nature of human existence. Applying that understanding in the design of both technological and human systems emerged as one of the key aims of developments in the systems field, with a proliferation of methodologies for applying systems insights in addressing the increasingly intractable problems confronting humanity.

In discussing the emerging field, von Bertalanffy (1968) identified three distinct orientations: systems technology, systems science, and systems philosophy, which he believed entailed unique perspectives, approaches, and, occasionally, mutually incompatible commitments. More recently, in his Bertalanffy lecture at the 2014 Annual Meeting of the International Society for the Systems Sciences (ISSS), David Rousseau expanded the systems technology orientation to encompass systems design. Either formulation highlights the dialectic between theory and practice, suggesting a potential role for systems research as a mediator among the various orientations, fostering a more systemic outlook and facilitating greater integration across disciplinary boundaries. In order to explore the nature of this role, it is necessary to articulate what is meant by a systems approach and what might be the common assumptions across the range of systems approaches, both theoretical and applied. To that end, a brief overview of the history of systems thinking will provide some context for addressing these questions.

Evolution of the Systems Field

Von Bertalanffy’s (1968) and Rousseau’s (2014) articulations of the various categories of systems thinking (technology/design, science, and philosophy) provide a framework for exploring the evolution of the field. They also raise the question of the distinction among various types of systems. Although these categories are somewhat fluid, it might be useful to identify the following five distinct types:

  • physical systems;

  • technological systems;

  • living systems, including both individual organisms and ecological communities;

  • human/social systems: economic, political, educational, medical, and so forth;

  • symbolic systems.

Although the three subfields (technology/design, science, and philosophy) emerged more or less simultaneously, they developed along relatively independent trajectories, albeit with a certain amount of cross-fertilization. Beginning with systems technology and design, which might also be described as “applied” systems sciences, the following section provides a brief schematic summary of developments in this area. Although closely related and often mutually influential, systems applications in technological systems can be distinguished from the application of systems concepts in the organization and management of social systems.

Applied Systems Approaches: Technology and Design

In looking at the applications of systems approaches in technological systems, it is helpful to distinguish between systems engineering, which deals primarily with the technological dimensions of a system, and the related fields of systems analysis, operations research, and management science, which deal more directly with the organization and management of both human and technological dimensions of evolving complex organizational structures.

Systems engineering can be defined as the design, development, production, and operation of large complex physical systems. The origin of systems engineering is generally traced to Bell Labs in the early 1940s, and—perhaps by necessity—the field tended to be somewhat more “systemic” from the very beginning than parallel developments in organizational management. Complex engineering projects required a comprehensive analysis of the system as a whole, with input from and ongoing evaluation of the system in relation to its environment, including the human systems involved in the production and eventual use of the product (Hall, 1962).

This process follows the basic format of the cyclical framework proposed above, although expanded into seven steps, as outlined by the International Council on Systems Engineering (INCOSE, n.d.): “State the problem, Investigate alternatives, Model the system, Integrate, Launch the system, Assess performance, and Re-evaluate” (“What is Systems Engineering,” para. 4; see also Chapter 8, Appendix “Systems Engineering”).
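Read cyclically, those seven steps can be sketched as a loop in which re-evaluation restates the problem for the next pass. The code below is illustrative only: the phase names are quoted from INCOSE, while the function, its parameters, and the example problem are hypothetical.

PHASES = [
    "State the problem", "Investigate alternatives", "Model the system",
    "Integrate", "Launch the system", "Assess performance", "Re-evaluate",
]

def engineering_cycle(problem, passes=2):
    # Walk the seven phases repeatedly; the outcome of "Re-evaluate" becomes
    # the restated problem for the next pass, making the process cyclical
    # rather than a one-way sequence.
    for n in range(1, passes + 1):
        for phase in PHASES:
            print(f"pass {n} | {phase}: {problem}")
        problem = f"{problem} (revised after pass {n})"

engineering_cycle("reduce commuter congestion")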

As technologies, and thus the organizational structures involved in their implementation, became increasingly complex, the application of systems approaches can be seen in techniques for optimizing decisions (systems analysis), coordinating logistics (operations research), and managing human participants in the systems (management science). Clearly, these three areas are closely interconnected and these definitions should be seen as broad and overlapping generalizations.

Initially, these three fields tended to draw on and operate according to fairly mechanistic principles and procedures and, along with systems engineering, are often referred to as “hard” systems approaches. This was primarily because they did not adequately account for the actual experience of the individuals involved in the system’s functioning, but instead tended to portray the systemic relationships in objective and quantitative terms.

In his discussion of systemic methodology, Gerald Midgley (2000) identified three waves of systemic inquiry that reflect a shift in focus from systems technology to a more collaborative process of systems design and a corresponding transition from “hard” to “soft” systems methodologies. He described the first wave as emerging out of a confluence of developments in the first half of the twentieth century, including scientific management, human relations, operations research, and action research. It is important to note here that action research, initially introduced by Kurt Lewin, was unique in seeking input from all relevant members of the system under investigation (see Reason & Bradbury, 2008).

Emerging in the 1970s, the second wave in the evolution of applied systems, often described as soft systems approaches, integrated a more explicit focus on the human experiential dimension, recognizing the significance of meaning and purpose in human activity systems, and emphasizing the importance of including relevant stakeholders in the process of inquiry and decision making. Related developments included, among others:

  • inquiring systems design, based on the work of West Churchman (1971);

  • soft-systems methodology, developed by Peter Checkland (1981); and

  • interactive planning, articulated by Russell Ackoff (1974).

The third wave Midgley identified is the “critical systems” approach, which began in the 1980s. Drawing on insights gained from the earlier initiatives, as well as on Werner Ulrich’s (1983) critical systems heuristics, it addresses issues of power relationships in organizations and adopts a more overtly emancipatory orientation. This approach is reflected in the works of Robert Flood and Mike Jackson (1991) and Midgley (1995). These developments in the systems technology and design field informed the theoretical orientation of systems science (see Hammond, 2014, for a more comprehensive discussion of applied systems theory).

Systems Science: Understanding the Nature of Systems

In parallel with these efforts to manage increasingly complex technological and organizational systems, three primary fields emerged in the 1950s and 1960s, with a more theoretical emphasis on articulating the dynamics of complex systems:

  • cybernetics, which grew out of the Macy conferences of the 1940s and 1950s;

  • general systems theory, initially proposed by von Bertalanffy and developed in the context of the Society for General Systems Research in the 1950s; and

  • system dynamics, which built on the work of Jay Forrester in the 1960s.

Cybernetics grew out of the recognition of nonlinear or circular causality, exploring the role of positive and negative feedback in biological, technological, and social systems, particularly in terms of information flows. An understanding of feedback processes provided insights into the structural relationships of complex systems and helped to explain the operation of self-organization in human, technological, and natural systems. As the field evolved, there was more of an emphasis on what became known as second order cybernetics, drawing on the work of Heinz von Foerster (1974) and Gregory Bateson (1972), which highlighted the significance of the observer and the role of consciousness, cognition, perception, meaning-making, and self-reflexivity.

The field of system dynamics, based on the work of Jay Forrester (1961), was also concerned with positive and negative feedback, although less in terms of information flows and more in terms of the internal dynamics of a system, which could be modeled using causal loop diagrams. In addition, system dynamics sought to explain the material stocks and flows in a system. In contrast to the field of cybernetics, system dynamics tended to reinforce a more objective approach to understanding and managing complex systems (see Richardson, 1991).
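A minimal stock-and-flow sketch of the kind such models formalize might look like the following (a hypothetical population example with invented parameters, not drawn from Forrester’s own models): a single stock fed by a reinforcing loop (births proportional to the stock) and restrained by a balancing loop (crowding as the stock approaches a carrying capacity).

def simulate_population(stock=100.0, birth_rate=0.08, capacity=1000.0, dt=1.0, years=60):
    # One stock (population) and two feedback loops acting on its net flow:
    # births reinforce growth, crowding balances it as capacity is approached.
    trajectory = [stock]
    for _ in range(years):
        births = birth_rate * stock                         # reinforcing (positive) loop
        crowding = birth_rate * stock * (stock / capacity)  # balancing (negative) loop
        stock += (births - crowding) * dt                   # Euler step over the net flow
        trajectory.append(round(stock, 1))
    return trajectory

print(simulate_population()[-1])  # the stock levels off near the carrying capacity

The corresponding causal loop diagram would show a reinforcing loop from population to births and a balancing loop from population through crowding back to population.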

General systems theory grew out of a much broader orientation than either of the other two fields, as it sought to identify general principles that characterized complex systems across the disciplinary spectrum. The concept of feedback, or nonlinear causality, was clearly significant in this regard, as were such concepts as emergence, the hierarchical (or holarchic) organization of complex systems, the capacity for self-organization and learning, and the role of perception, interpretation, meaning, and purpose in human systems.

Systems Philosophy: Implications for Research

It was the significance of systems philosophy about which von Bertalanffy was perhaps most passionate. He saw systems theory as providing an alternative to the mechanistic models dominating the science of his time. For him, the mechanistic worldview, particularly what he called the “robot model” of humanity, could be blamed for many of the evils plaguing the world. He believed that systems theory offered a new way of conceptualizing reality, one that honored the autonomy and creativity of living systems.

In the introduction to Systems Concepts in Action: A Practitioner’s Toolkit (2010), Bob Williams and Richard Hummelbrunner begin with a discussion of three primary orientations that they believe characterize a systems approach:

  • An understanding of interrelationships

  • A commitment to multiple perspectives

  • An awareness of boundaries (p. 3).

In broad terms, these characteristics might be seen as reflecting, respectively, the ontological, epistemological, and ethical implications of a systems view: systems ontology concerns itself with the dynamics of relationships within a system and between the system and its environment; systems epistemology necessitates a more inclusive understanding drawn from viewpoints both within and outside of the system, rather than from a single “objective” observer point of view; and systems ethics builds on this inclusivity, reinforcing a much broader consideration of actors within and outside of a system.

Ontological Considerations

With regard to the ontology of systems, there are two questions to consider. The first focuses on the ontology of a system (i.e., what is a system?), which corresponds with research into the nature of systems. The second focuses on a systems ontology (i.e., what is the nature of reality from a systems orientation?), which is relevant to the process of doing research from a systemic perspective.

In addressing the first question, it is important to understand that a system is not so much a “thing” as a process. This approach resonates with Process Philosophy, introduced by Alfred North Whitehead (1929), who worked closely with Henderson and Cannon. All three scholars had considerable influence in the evolution of certain branches of systems theory (see Miller, 1978). The emphasis in this view is on change—the process of becoming, rather than static states of being. It portrays the nature of reality as a continual flow of matter, energy, and information.

Building on the work of John J. Kineman (2011, 2012), the authors of this volume adopted the four-quadrant shared framework, which articulates an evolutionary progression through a cycle of observation, reflection, planning, and action. The nature of this systems research framework is dynamic and highlights the ontology (being-ness) of a system as process, embedded in interactive patterns of relationship. The cyclical progression illustrates the evolutionary potential of feedback processes, as the system responds to inputs from the environment as well as changes in its own internal dynamics resulting from previous action.

Research into the nature of systems involves an articulation of the mechanisms involved in a particular system’s behavior; in essence, it is a search for an underlying causal explanation. In seeking to understand a system, questions of ontology ultimately involve questions of history. The epitome of this kind of focus is research into the origin of the universe. Moving in the opposite direction around the four-quadrant framework, one can begin with the universe as the focus for the investigation, and seek to explain the dynamics that account for the observed phenomena. This leads to the discovery of certain patterns and laws that inform the dynamics of the system, which—though not necessarily conscious or purposive—constrain the available options in the evolution of the system.

The activities identified in each of the four quadrants of this framework reflect the four causes initially proposed by Aristotle (see Falcon, 2012):

  • Observation: identification of the material system—material cause;

  • Action: identification of the dynamics of the system—efficient cause;

  • Planning: the constraints operating in the choice of action, whether conscious or not—formal cause;

  • Reflection: building on prior evolutionary states, the condition from which the other causes flow—final cause.

Although the latter two categories have been trivialized and deemed irrelevant in modern science, understanding the four causes in a cyclical rather than a linear progression provides insights into the evolution and mechanisms of physical systems, as well as technological, living, human/social, and symbolic systems. One can discover the chemical composition and structure of a rock, for example, but it takes geological analysis (including the location of the particular sample) to explain its particular history and how it came to be what it is.

In my own work, “Philosophical and Ethical Foundations of Systems Thinking” (Hammond, 2005), I explored the second question regarding systems ontology. Beginning with an emphasis on the holistic nature of reality and the importance of considering relationships, both among the components of a system and with the larger environment, a systems-oriented ontology highlights organization, interaction, and interdependence. It shifts from the atomistic and individualistic orientation of the mechanistic worldview toward a more organic conception of nature and an appreciation of the patterns and processes of relationship.

In addition to the phenomenon of feedback, the concept of emergence is central in understanding the implications of a systemic worldview. In the simplest terms, the concept of emergence suggests that the whole is more than the sum of its parts, or that systems cannot be understood nor their behavior predicted based solely on information relating to the individual parts. Through the interaction of the individual components, novel qualities and phenomena emerge. In contrast to the analytical orientation of classical science, a systemic approach engenders a consideration of whole systems.
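A standard illustration of emergence (not drawn from the sources discussed here) is Conway’s Game of Life, in which each cell follows only local rules about its immediate neighbors, yet coherent structures such as gliders appear and propagate at the level of the whole grid; a minimal sketch:

from collections import Counter

def step(live_cells):
    # Count live neighbors of every candidate cell, then apply the local rules:
    # a cell lives if it has exactly 3 live neighbors, or 2 and is already alive.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items() if n == 3 or (n == 2 and cell in live_cells)}

# A glider: after four steps the same five-cell shape reappears, shifted
# diagonally, although nothing in the local rules mentions "gliders" at all.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))

The emergent pattern is a property of the interacting whole, not of any rule or cell taken in isolation.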

Growing out of this awareness, another key concept is an appreciation for the hierarchical or holarchic organization of complex systems. Just as systems cannot be understood by examining the individual parts, it is essential to understand systems in the context of their environment; hence, system and environment comprise an interactive process. From this perspective, there are many levels of organization within complex systems. The constituent parts of a system at one level are often complex systems themselves, embedded in the environment of the higher-level system, and containing their own interacting components.

It is the interactive process between the system and its environment, and the dynamics of feedback that result from this interaction, that nurture the emergence of the sophisticated properties that characterize complex systems, such as the capacity for learning and self-organization. In the context of human systems, this highlights perception, interpretation, meaning, and purpose as integral parts of the system, which are critical to understanding the epistemological and ethical implications of a systems orientation.

Epistemological Insights

An essential starting point for a systemic epistemology, and thus for research from a systemic orientation, is the recognition of the observer as an integral part of the system, which is a departure from the classical assumption of a neutral objective standpoint outside of the system. This is particularly important when dealing with human systems where, as Kenneth Boulding (1956b, 1968) has pointed out, knowledge of the system becomes an important part of the system. This is actually true in relation to physical, technological, and biological systems as well, which might be most easily seen in the evolution of computer technology. Further, in the process of observing natural systems, an observer brings assumptions, biases, and motivations that influence the process of observation.

From a systems perspective, knowledge is a dynamic and dialectical process of interacting with a system. The following are some questions and considerations that a systems-oriented researcher might want to consider as a starting point:

  • What is my own relationship with the system I intend to study?

  • What conceptual framework is guiding my choice of research topic?

  • What assumptions, beliefs, and values am I bringing to the research?

  • What do I hope to learn?

  • What impact will my research have on the system?

  • What possible blind spots might I need to consider?

  • How might I gain insights from the system itself?

The last question is particularly relevant in connection with human systems, although a systemic epistemology highlights the need to consider multiple perspectives in research into any kind of system, where these questions might be expanded to address the following considerations:

  • What might I learn from other disciplinary perspectives?

  • What aspects of the system’s environment might be relevant to my research?

  • How will my research affect the larger social or ecological environment of the system?

The epistemological dimension is reflected in the two right-hand quadrants of our shared framework, observe and reflect, which then imply a further iteration of planning and action. The shared framework thus transcends the traditional separation between theory and practice and supports a more collaborative approach to research. The appreciation for the pluralistic and participatory nature of systemic knowledge, as an evolutionary process of perception, interpretation, and creation of meaning, has nurtured the development of systems methodologies with an explicitly ethical commitment to inclusivity.

Ethical Implications

A fundamental orientation in systemic research is a consideration of purpose as an integral part of the research process. Based on his understanding of human systems as purposeful systems, composed of purposeful parts, and also part of larger purposeful systems, Russell Ackoff (1974) described the challenge of management as designing human systems in ways that can “serve their own purposes, the purposes of the purposeful parts, and the purposes of the larger systems of which they are a part” (p. 18).

The questions posed in the previous section challenge the systems-oriented researcher to consider the possible implications of their research in relation to the purposes of both the purposeful parts of the system and the larger system of which it is a part. This latter concern is equally relevant in nonhuman systems. Engaging the question of purpose illuminates some key principles of a systemic ethic. Recognizing the embeddedness of both research and researcher in a larger social and ecological context, it is important to understand the possible ramifications of the research project in the larger system. Some additional questions to be considered are:

  • Whose interests does the research serve?

  • Are there aspects of the system that might be negatively impacted by my research?

  • What are my own motivations in doing the research?

Considering a systems-oriented research project in the context of the larger environment recalls the concept of a participatory universe. As an integral part of the universe so conceived, one might consider systems research not as something done to a system, but rather conducted in partnership with a system. This is clearly evident in the participatory methodologies that have emerged in the context of social systems, with an emphasis on collaborative design processes. It is somewhat more challenging to consider what it might mean in relation to nonhuman systems.

In order to address this question, it is helpful to consider a classification of ethical orientations introduced by Carolyn Merchant (1992). Initially, she proposed three ethical orientations: egocentric, homocentric, and ecocentric. The first orientation is focused solely on considerations of personal benefit, while the second takes into account the interests of humanity as a whole, and the third extends concern to the larger ecological context. In her later work, Merchant (2003) expanded these categories to include a fourth category of partnership ethics, which she described as grounded in the “concept of relation” (p. 223).

Riane Eisler (2003) has also popularized the concept of partnership systems in contrast to dominator systems. According to the Center for Partnership Studies (n.d.), which promotes a cultural transition toward more collaborative ways of relating to one another:

There are two fundamental ways of organizing beliefs and institutions: the partnership system and the domination system. The degree to which a society or organization orients to the domination or partnership side of the partnership-domination continuum profoundly affects how we relate to ourselves, one another, and nature. (“The Domination-Partnership Systems Continuum,” para. 2)

It is within this perspective that we might consider the implications of a participatory ethic in nonhuman systems. West Churchman, former President of the ISSS, offered some compelling observations in this regard. Described by Robert Flood (1999) as the moral conscience of the systems field, Churchman believed that science should address itself to the serious problems confronting humanity, and further that scientists should be responsible for the social (and I would suggest also ecological) consequences of their discoveries (pp. 61–68).

An important example in the physical sciences that embodies this orientation is the emergence of the relatively new field of green chemistry, which is defined by the U.S. Environmental Protection Agency (n.d.) as “the design of chemical products and processes that reduce or eliminate the generation of hazardous substances” (“Green Chemistry,” para. 1). Noting the systemic interrelationship of developments in this field, the agency goes on to state that the “EPA’s efforts to speed the adoption of this revolutionary and diverse discipline have led to significant environmental benefits, innovation and a strengthened economy” (“Green Chemistry,” para. 1).

One of the most critical ethical considerations is the question of boundaries; good systems research is broadly inclusive. It must be clear about the reasons for the boundaries it draws around the system under consideration, what is being left out, and the possible consequences of those choices. Ultimately, good systems research supports the cultivation of whole systems thinking, seeking to nurture the health and integrity of the systems it serves and to manage the systems that structure our lives in ways that honor the needs and purposes of all participants, as well as the larger environment within which those systems function.

Concluding Reflections on Systems Research

Traditional research, in the spirit of Sir Francis Bacon, sought to understand the world in order to be better able to predict and control the external environment, assumed a posture of detachment in relation to the phenomena under observation, and presumed the existence—and aspired to the mastery—of a stable objective truth. This assumption of objectivity marginalized considerations of values and subjective experience. A systemic approach eliminates the separation between knowledge and action, and calls for a much more inclusive and comprehensive orientation, encompassing a multidimensional analysis—scientific, sociopolitical, economic, environmental, and so forth— and the inclusion of all relevant stakeholders in the determination of future actions.

Ultimately, a systemic orientation to research might be seen as nurturing a transition from control to collaboration, from competitive relationships to a greater recognition of interdependence, from hierarchical to participatory decision-making processes, and from objectivity to reflexive self-awareness.

As the world becomes increasingly complex and human systems increasingly interdependent, it is essential that humanity learns how to manage the organizations that structure our lives in ways that honor the needs and purposes of all participants in the system, as well as the larger environment within which that system functions. While traditional discipline-based research provides a foundation for whole system understanding and effective action, it lacks an adequate model for integrating the fragmented pieces into a coherent whole. Systems research provides a framework for meaningful multidimensional synthesis of the situation or problem under consideration and, as much as possible, integrates perspectives from all aspects of the system.

In the chapters that follow, this approach will be elaborated in greater depth. The next chapter will provide a comprehensive overview of the four-quadrant framework that informs this collaborative work. Chapter 3 provides guidelines for structuring research problems and developing an effective research design. Chapter 4 explores the use of models in structuring the research process, organizing data, and understanding the system being studied. Chapter 5 articulates various methodologies for carrying out the research. Chapter 6 outlines approaches to reporting research. Chapter 7 examines the competencies required for good systems research, and the final Chapter 8 provides guidelines for evaluating research.