Teaching a way of thinking is harder than teaching substantive factual material. (Checkland, 1999, p. A42)

Introduction

This guide has described how systems researchers must account for the philosophical underpinnings of their inquiry, how a systems framework can inform research work, and how to design research that is both systemic and systematic. We have discussed how modeling can be used in systems research, how researchers take action to conduct a study in effective ways, and how to communicate the outcomes of systems research. All of these are important stages systems researchers must move through in the course of a research project. This chapter does not address a specific stage in the systems research process. Rather, it focuses on competencies a systems researcher must possess in order to embark on systems research at all. We consider the competencies described here to be foundational—necessary regardless of one’s favored systems tradition or methodology: system dynamics or cybernetics, critical systems heuristics or systems engineering, soft systems methodology or inquiry into complex adaptive systems. Abundant research has addressed the skills particular to each of these approaches to systems inquiry. In this chapter, we seek instead to identify competencies required of any individual who wishes to conduct research about a systemic phenomenon, regardless of the systems traditions or methods that inform that research. While the literature on competencies often stresses behavioral techniques to be mastered and action steps to be taken in a given situation, we will focus here on perceptual competencies—cognitive and affective abilities we consider crucial for perceiving systems and navigating the challenges of conducting rigorous systems research.

Knowledge about systemic phenomena is important, given the complexity of the gravest problems and greatest challenges we face in the twenty-first century. Systems researchers play an important role in meeting those problems and challenges. Over time, a large body of knowledge has been amassed, enabling us to understand the ontological workings of a system. However, systems are not apparent to everyone. Perceiving a system amidst disparate people, behaviors, objects, and events spread over time and space is a considerable mental leap, one regularly taken by systems researchers. What competencies are required in order to do this? What competencies are required of those wanting to investigate systems in a rigorous way? To date, the cognitive competencies required for perceiving systemic interconnectedness and the affective competencies required for skilled decision-making when working with systemic phenomena are largely unknown. Our intention in this chapter is to suggest key perceptual competencies necessary to do systems research.

Competency in Perceiving the Qualities of Parts and Wholes

Measurement is not all there is to scientific activity. (Morin, 2015, p. 237)

Since its beginnings as a school of scholarly thought, systems theory has taken a stand against reductionism as the only philosophical stance fruitful for scientific inquiry. As a governing logic for science, reductionism has led generations of researchers to divide complex phenomena into parts, and to categorize and measure them. It has guided us to understand large, complicated entities by searching for simpler, basic entities within them. However, its success as a governing logic has come to mean that reductionism is seen as the legitimate way to use the mind in conducting research. It has come to mean that scientific inquiry ought to focus primarily on certain qualities of a phenomenon—“the quantifiable and the mathematizeable” (Goodwin, 1999)—deeming the non-measurable qualities of phenomena that exist in our world less significant. Science has evolved to handle “primary” (i.e., measurable) phenomena fairly efficiently. However, science considers the phenomenal properties of experience (Tye, 2015)—the subjectively experienced qualities of life termed “qualia” (C. Lewis, 1929)—to be “secondary” (Locke, 1995). Thus, commitment to reductive logic has impoverished our understanding of many interesting questions: What are the humanly experienced properties of climate change? Of individualism? Of information technologies? Of health? Of refugee crises? Of spiritual belief? The organizing scheme of science is a vantage point with distinct disadvantages.

Since Galileo, the methodology of science has emphasized the study of number and measure (Goodwin, 1999). To measure something, one must delineate it from other things. Thus, Western science has evolved as a pursuit of knowledge through “analytical consciousness, which emphasizes distinction, separation, and causality” (Bortoft, 1997, p. 91). Analytical consciousness is indispensable for quantitative reasoning. Analysis is oriented toward counting parts (which necessitates separating that which is to be counted). Thus, scientists become highly skilled at perceiving separateness. In its enthusiasm for measuring quantities, science tends to de-emphasize the skill involved in identifying the qualities of phenomena. In particular, the ability to perceive qualities of wholeness—the ways elements of a phenomenon belong together in relationship—gets obscured by analytical consciousness.

And yet, humans (scientists included) experience the non-separated ways the world operates every day—from traffic jams to the intricacies of workplace politics. Humans perceive the ways interconnected phenomena go together in wholes by a psychological process Carl Jung termed “intuition” (Jung, 1971). In a universe that involves both quantities and qualities, disciplined intuitive skill is as important as disciplined analytical skill. Since the eighteenth century, scientists have documented rigorous and replicable research using non-analytic methods. At the core of these methods is a focus on discerning qualities of wholeness; thus, they are variously referred to as a “science of wholeness” (Bortoft, 1996) or a “science of qualities” (Albertazzi, 2015; Goodwin, 1999).

Suggestion: Consider the ways you approach your systems research work

  • How do you “divide and conquer”? This illustrates the workings of your analytical consciousness.

  • What do you find hard to “divide and conquer”? This hints at properties of inherent wholeness you may be intuiting that cannot be well understood via analysis.

Why should it be necessary for systems researchers to develop highly refined capacities to perceive qualities? To thoroughly describe the qualities of a thing you must describe its relationship to other things—that is, you must describe its qualities of relatedness. Relational qualities point to unifying structures, systemic ordering principles—that is, wholeness—exerting influence on parts in ways that are often obscured by the sheer number and diversity of parts involved in complex phenomena.

It is not easy to discern qualities of systemic relatedness among parts, particularly when those relationships operate across large distances of space or time. Yet these qualities of relatedness are precisely the qualities that analytically oriented science often fails to understand, and precisely the unique contributions that systems researchers can make. It takes considerable skill to convincingly communicate a system’s qualities of relatedness to others (Varey’s Chapter 6 of this book is dedicated to this challenge). Laypeople and scholars alike are used to seeing the world as filled with fundamentally discrete objects, events, and ideas. For most people, parts seem fairly manageable, tangible; we experience wholes (i.e., systems) as more difficult, abstract, and somehow less real than the parts of which they are composed. To the systems researcher, however, parts are interesting precisely because of their qualities of connection to larger systems: To the systems thinker, “A part is only a part according to the emergence of the whole which it serves!” (Bortoft, 1996, p. 11).

Suggestion: A thought experiment

Consider several data points you have collected that appear unrelated:

  • Under what circumstances, by what ordering principle, would these seemingly unrelated data make perfect sense? In what ways might they be necessary indicators of a particular systemic wholeness, localized expressions of a greater wholeness you don’t yet understand?

  • Before discarding data as outliers, invest time in this thought experiment, assuming that the mere fact they occurred could indicate they express particular, significant qualities of a pattern of systemic wholeness.

At its most fundamental, systems thinking requires the ability to discern that objects, events, and ideas relate to one another in ways that create systems. We might frame this as parts-driven systems thinking. A subtler systems thinking skill is to discern how every part contains qualities of a wholeness of which it is a part. Consider what “wholeness” means according to the Oxford English Dictionary: “completeness, being in an unbroken or undamaged state.” From the perspective of wholeness, the presence and behavior of parts are precisely what is necessary for the expression of that particular wholeness. We might frame this as wholeness-driven systems thinking. When we consider the deeply embedded systemic complexity of the universe—that every system is a part of many wholes—we must consider that every choice to alter a system is a choice to break or damage a particular expression of wholeness, of completeness, which likely impacts multiple expressions of completeness far beyond what we can predict.

Suggestion: Consider a university where you have studied

  • Parts-driven systems thinking: In what ways are its students, faculty, and staff necessary to the existence of that university?

  • Wholeness-driven systems thinking: How does that university express its qualities through the experiences of the students, faculty, and staff that comprise it?

Implicit in a systemic worldview is the idea that the qualities of wholes and the qualities of parts each inform the other. This demands that systems researchers develop the ability to orient themselves to both, privileging neither, in a movement akin to the hermeneutic circle. Systems researchers must be competent in both parts-driven systems thinking and wholeness-driven systems thinking. Physicist and philosopher of science Henri Bortoft describes the parts–whole relationship:

Parts are the place of the whole where it bodies forth into presence. The whole imparts itself; it is accomplished through the parts it fulfills.…[Wholeness] emerges simultaneously with the accumulation of the parts, not because it is the sum of the parts, but because it is immanent within them.

This process tells us something fundamental about the whole.…If the whole becomes present within its parts, then a part is a place for the “presencing” of the whole. If a part is to be a place in which the whole can be present, it cannot be “any old thing.” Rather, a part is special and not accidental, since it must be such as to let the whole come into presence. The speciality of the part is particularly important because it shows us the way to the whole.…The way to the whole is into and through the parts.

The whole is…not to be encountered by stepping back to take an overview, for it is not over and above the parts, as if it were some superior, all-encompassing entity. The whole is to be encountered by stepping right into the parts.…There is dual movement: we move through the parts to enter into the whole which becomes present within the parts. When we understand, both movements come together. (Bortoft, 1996, pp. 11–12)

This passage hints at the intricate skill demanded of systems researchers seeking to understand the qualities of both parts and wholes. In a systemically structured universe, we seek to perceive the qualities of many diverse parts because they clarify the wholeness of the system in which those parts operate. At the same time, we take the position that the particular qualities of wholeness being expressed by any given system are necessarily and accurately expressed through the diversity of the parts that comprise it.

Fundamentally, systems researchers must possess competency in perceiving systems. We turn our attention now to competencies required to research systemic phenomena: competencies necessitated by complex systems, analogical reasoning, engaging with the unknown, and reflexivity.

Complexity-Driven Competencies

Often, the systems a systems researcher chooses to investigate are complex. Complexity itself demands much of researchers. Understanding a phenomenon as a system exhibiting characteristics of complexity means that one cannot rely on research strategies aimed at simplifying phenomena. Rapoport and Horvath (1959) describe the hope on which such strategies rest:

The parts being simpler, they are supposedly more amenable to understanding. The idea of analysis is to understand the working of the parts…the hope is that it is possible to “build up” the understanding of complexity by superimposing the workings of the various parts. (p. 87)

Complex systems elude comprehensive understanding through such analytic superimposition. Neither can we assume that the entities comprising a system are arranged randomly, which lessens the confidence we can place in statistical probabilities (given that probability statistics work best when we can justify assumptions of randomness). With complex systems, we find ourselves in a middle ground between simplicity and chaos (Weaver, 1948). Well-regarded skills useful in other subjects of inquiry must take a less central role in complex systems research. Other competencies become paramount when one seeks to understand the “deep nature” of a complex system (Lewin, 2002). Here, we propose four: competencies in perceiving order, change, relationship, and information.

Competency in Perceiving Order

We live in a universe that produces information, and we are grammatical creatures, right down to the forty-six chromosomes producing our biological essence. —Robin Wood, Beyond “e”: Creating an Intelligent World

For many, the thrill of systems research is identifying order where others have not. Order involves a particular arrangement of entities—be they ideas or automobiles, galaxies or germ cells—configured in space and time in an identifiable pattern. Order is the grammar of a particular system. The search for order involves the search for occurrences of objects or events that happen with regularity, exhibiting some kind of invariance amidst seeming change, some kind of self-similarity at different levels of analysis. The arrangement of objects and events in complex systems often appears haphazard; systems research is a means by which we can identify how such apparent disarray may be ordered by relatively simple rules. Regularities may be rough; they may also be found to be quite precise (Mandelbrot, 1963). In complex systems, order is oftentimes highly intricate.

This intricacy can emerge in dynamics of self-organization, whereby local interactions give rise to behaviors coordinated at a large scale in a complex system. Self-organized order can be difficult to detect. It can involve behaviors that are steady, yet never repeat themselves precisely, making the pattern harder to detect. Systemic behaviors often combine a measure of predictability and chaos—as in cases where behavior in a system is unstable at the local level yet the system remains stable at the global level. It is unsurprising that the term “strange attractor” (Ruelle, 1971) was coined to describe some kinds of systemic self-organization!
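
To make this combination of local instability and global stability concrete, here is a minimal sketch of our own—not drawn from the sources cited above—using the logistic map, a one-line rule whose behavior never exactly repeats yet stays confined within a stable band. The function name and parameter values are hypothetical choices made purely for illustration.

```python
# A minimal, illustrative sketch: the logistic map, x -> r * x * (1 - x),
# is a single simple rule. At r = 3.9 its behavior never exactly repeats
# (local unpredictability), yet every state stays bounded in (0, 1)
# (global stability). Names and parameters are illustrative choices.

def logistic_trajectory(x0: float, r: float = 3.9, steps: int = 1000) -> list[float]:
    """Iterate the logistic map from x0 and return the visited states."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

if __name__ == "__main__":
    a = logistic_trajectory(0.200000)
    b = logistic_trajectory(0.200001)   # a nearly identical starting point

    # Local instability: tiny differences in starting conditions grow rapidly.
    print("divergence after 50 steps:", abs(a[50] - b[50]))

    # Global stability: despite that divergence, every state stays in (0, 1).
    print("min/max of trajectory:", min(a), max(a))
```

Running the sketch shows both faces at once: two nearly identical starting points diverge quickly, while every state remains confined within the same bounded region—a simple rule producing intricate, patterned behavior.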

Suggestion: Think about self-organization

  • Some assume that self-organized dynamics, occurring as they do without intentional design, must reflect natural laws that are therefore optimal.

  • Adam Smith’s argument about the “invisible hand” of markets is an example. However, “there is nothing intrinsically ‘good’ about the outcomes of self-organization.…Self organizing processes happen. There is no intrinsic superior morality or ‘best fit’ that emerges from the process” (Boulton, Allen, & Bowman, 2015, p. 45).

  • Self-organization is a dynamic that gives rise to generative social change and to pandemics, to climate change and to the birth of new galaxies. It is an amoral phenomenon.

The ability to perceive order grants a systems researcher the ability to perceive forces for homeostasis that have developed in a system over its life span. Complex systems have an array of mechanisms for self-regulating, for maintaining states of equilibrium. Sometimes, systemic order itself is the focus of the systems researcher’s inquiry, as evidenced in countless studies examining the nature of and conditions involved in ideas such as “balance” and “sustainability.” Perceiving order allows one to perceive anomaly within a system, which may signal the weakening of one ordering principle and the shifting of the system toward another (i.e., the destruction of an old order, as a system’s behavior gives way to a new one). Complicating matters, some systems can only achieve order through some degree of instability, as in the case of complex adaptive systems (Allen, 1997), which must continually respond to both internal and external events in order to maintain themselves. This demands of the researcher an openness toward both stability and its absence in order to accurately perceive and report on both.
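
As a simple illustration of homeostatic self-regulation—again a sketch of our own, with arbitrary illustrative values rather than anything drawn from the cited literature—the following toy model lets random disturbances push a state around while a negative-feedback rule pulls it back toward a set point.

```python
# A minimal sketch of homeostasis via negative feedback (illustrative only).
# Disturbances push the state around; a corrective response proportional to
# the deviation from the set point keeps the system near equilibrium.
import random

def simulate_homeostasis(set_point: float = 37.0, gain: float = 0.5,
                         noise: float = 1.0, steps: int = 200) -> list[float]:
    state = set_point
    history = []
    for _ in range(steps):
        disturbance = random.uniform(-noise, noise)   # external perturbation
        correction = gain * (set_point - state)       # negative feedback
        state = state + disturbance + correction
        history.append(state)
    return history

if __name__ == "__main__":
    history = simulate_homeostasis()
    # Despite continual disturbance, the state hovers near the set point.
    print("mean state:", sum(history) / len(history))
    print("largest deviation:", max(abs(s - 37.0) for s in history))
```

Weakening the gain parameter (the strength of the corrective feedback) lets the same disturbances carry the state progressively further from equilibrium—a toy version of an ordering principle losing its hold.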

Competency in Perceiving Change

Complex systems are dynamic; they change. Change takes many forms and demands of a systems researcher the ability to perceive and describe qualities of movement—difference over time or across space. Sometimes, systemic change itself is the focus of the systems researcher’s inquiry, as evidenced in countless studies examining the nature of and conditions involved in ideas such as “evolution” and “emergence.” Studies with these aims can subtly pull the researcher into an aversive relationship with the system’s homeostatic, self-regulatory behaviors. Our challenge is to objectively discern and report on the functions and effects of both stasis and flux.

The nature of change in complex systems research exerts certain demands not faced by other researchers. For example, we need subtlety in ascribing magnitude. Given the nonlinearity of complex systems, “there is not a direct and easily predictable linear relationship between an agent’s actions and the consequences of that action” (Lissack, 2002, p. 22). For researchers of simple systems or nonsystemic phenomena, the demands to classify behavior as important or unimportant, significant or negligible, big or small are relatively uncomplicated. Nonlinearity as a systemic characteristic makes the demand to ascribe magnitude challenging for the systems researcher. “Big” is not necessarily powerful; “small” is not necessarily weak.

Perceiving change requires a researcher to discern between regularity and novelty. In a changing system, novel actors, actions, and ideas emerge, at times quickly, often unpredictably. Detecting them demands that a systems researcher have flexible and responsive sensemaking structures (Weick, 1993); otherwise, early indicators of change will be unintentionally filtered out of the researcher’s perceptual field.

Competency in Perceiving Relationships

The etymology of the word system includes the idea of “things placed or set together” (Partridge, 2009). For this reason, a most rudimentary competency in systems research involves recognition of multiple things involved in a phenomenon, requiring a researcher to seek out and examine each of them in their own right. This competency is so foundational that for many people, the phrase “systems thinking” means just this: that by virtue of their pluralistic nature, systems have more than one thing with which we must contend.

Yet, alongside this rudimentary competency driven by pluralism, the work of systems research is to investigate the way things are “placed or set together.” An important competency for a systems researcher is the ability to perceive the particular placement of entities comprising the system under study, and the particular qualities of their togetherness. Put simply, a systems researcher must have the ability to perceive relationships. In a complex system, this is not straightforward. In any given system, parts may relate in highly coordinated ways, may overreact or underreact to one another, or may not relate at all. The action one part produces may be minimal or extreme at different points in time. Much of the work of systems research involves exploring the particularities of relationship among parts of a system, which one usually finds are densely interconnected in unexpected ways. Parts of a system can be related in ways characterized by mutuality or asymmetry. Entities within a system can behave in ways that are dependent or counter-dependent. The actions and reactions that occur among related elements of a system can be regular or can vary. In systems research, we generally assume that actions of one element of a complex system will have consequences for another.

The relatedness that characterizes complex systems demands of the systems researcher particular perceptual skill pertaining to the issue of independence. It may appear that certain parts of a system operate independently. If the system under investigation is a living or social system, such appearances are illusory; it is not possible for entities involved in living or social systems to be independent of others. However, independence is a coveted cultural value, particularly in Western nations (Hofstede, 1984). Systems researchers who ascribe positive meanings to independence may unintentionally search for it or interpret data in ways that accept appearances of independence uncritically. A competency vital to the researcher of complex systems is the ability to investigate apparently independent entities for their relationships with other entities. Doing so opens the possibility of uncovering relationships that may be subtle, yet crucially important to understanding the functioning of the system.

Just as in complexity we cannot presume the action of one entity has no effects on others, likewise we cannot presume that the proximity of entities dictates the strength of attunement they have to each other. In much scholarly inquiry, outlier data can be safely discarded as measurement error, or as merely irrelevant. However, complex systems can exist across widely distributed space and time. It is not always the case that relationships among objects or events within a system are significant only if they occur with relative nearness. Complexity demands the systems researcher develop a different stance toward time: “In living systems, even very simple ones, the behaviour at a given time is partly determined by memory and partly by the anticipation of the future. In this sense the future contributes to the present” (Prigogine, 2001, p. 225).

Competency in Perceiving Information

Perceiving the material objects within a system is easy. Perceiving the information that flows among them, and among the system’s past, present, and future, is not. Systems are communicative. They do many things with and to information: import it, attract it, create it, digest it, stockpile it, move it around. At times, they actively ignore it (Mendonça, Cardoso, & Caraça, 2015), suggesting that systems can mount a kind of immune response to information they perceive as threatening. Given the dense interconnectivity within complex systems, information exchange is an important systemic dynamic. The way a system engages with information can affect its behavior in both productive and counterproductive ways. Unsurprisingly, then, systems evolve complex signaling processes to communicate information. Given that survival can be at stake in relating well to information, signal detection can be crucial to a system’s continuance (Snodgrass, 1972); failure to detect “weak signals” (Ansoff, 1975) can cause a system’s demise.

Given the complexities of many systems, discerning the presence of information (crucial to being able to link causes with effects) can be challenging. Systems research is stimulus-intense work. Amidst those stimuli, we often perceive “incomplete, unstructured, and fragmented information” (Mendonça et al., 2015, p. 6). The systems researcher generally cannot afford to focus solely on qualitatively obvious or quantitatively high levels of information exchange within a system (i.e., strong signals). Organizational researchers have noted that relationships among parties engaged in weak signaling can play potent roles, connecting otherwise disconnected parts of a network and importing new variety where regions characterized by strong signaling do not (Granovetter, 1973). In effect, information exchange that is weak can be strong in its effects on the system’s overall functioning. The challenge for systems researchers is to detect information and to ascribe accurate meaning to it. This is no small feat. Every researcher’s training and life experience create cognitive-affective frameworks (also called worldviews [Kant, 1790] or mental models [Craik, 1943]). Those frameworks are mixed blessings: the education enabling our interpretive capacities as systems researchers provides us with dominant thought patterns that prevent us from perceiving information that seems inconsistent with what we know. Mendonça et al. (2015) describe the situation:

A crucial feature of weak signals is their “weird” character.…Their weirdness is related to a gap in the current dominant frameworks of thought; hence what they imply is difficult to articulate in the context of the established grid of knowledge parameters. (p. 6)

Weak signaling is not created only by the systems we study; it is created by the ways we have been trained to think.

Competency in Analogical Reasoning

General system theory is not a search for vague and superficial analogies. Analogies as such are of little value… (von Bertalanffy, 1969, p. 33)

Analogy is an indispensable and inevitable tool for scientific progress. I mean a special kind of similarity which is the similarity of structure, the similarity of form, a similarity of constellation between two sets of structures, two sets of particulars, that are manifestly very different but have structural parallels. It has to do with relation and interconnection. (Oppenheimer, 1956, p. 129)

Many systems researchers consider Ludwig von Bertalanffy a founding father of general systems theory, and have been greatly inspired by his goal of a general theory of systems that could unify knowledge across disciplines. Echoing him, many systems researchers argue that the work of identifying systemic isomorphies is not a matter of inventing analogies. Their concern is understandable. When an American president drew an analogy between Saddam Hussein and Adolf Hitler—an “indiscriminate inference,” or “naïve analogy” (Hofstadter & Sander, 2013)—it galvanized widespread public support for what became a lengthy and destructive Persian Gulf conflict. A mode of reasoning that can lead nations to war is, understandably, one that could do much to discredit scholarly work.

Yet, what should we make of Robert Oppenheimer’s claim that analogical reasoning is indispensable? Contemporary cognitive psychology supports his claim: “Analogy is ‘the core of cognition’” (Gentner, Holyoak, & Kokinov, 2001, p. 2). Of particular significance to researchers, Gick and Holyoak (1983) said:

To make the novel seem familiar by relating it to prior knowledge, make the familiar seem strange by viewing it from a new perspective—these are fundamental aspects of human intelligence that depend on the ability to reason by analogy. This ability is used to construct new scientific models, to design experiments, to solve new problems…to make predictions, [and] to construct arguments. (pp. 1–2)

Contemporary psychology should make it difficult for today’s systems researchers to dismiss analogical reasoning as summarily as von Bertalanffy did.

Simply put, analogical reasoning is a way the human mind identifies commonalities between two or more situations. Humans take knowledge about one situation and transfer it to another through a process that involves two aspects:

  • 1. Finding “one to one correspondences” (Gick & Holyoak, 1983, p. 2) between the objects in a familiar situation and the objects in a new one; and

  • 2. Finding correspondences between the ways objects in the familiar situation are related to one another and the relationships among objects in the unfamiliar one.

Thus, analogical reasoning “is a process of aligning object representations and relational structures” (Dietrich, 2010, p. 335) from one situation to another. When we reason by analogy in systems research, we search for (a) similar objects in more than one system that are (b) relating in similar ways.
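
The toy sketch below illustrates this two-part search. It is a deliberately simplified, hypothetical example of our own—not an implementation of any published structure-mapping model—in which each “system” is a set of (relation, subject, object) triples and a candidate object-to-object mapping is checked for the relational correspondences it preserves.

```python
# Illustrative sketch of analogical alignment (hypothetical toy example,
# not a published structure-mapping algorithm). Each "system" is a set of
# (relation, subject, object) triples; we test a one-to-one object mapping
# by checking which relations in the base carry over to the target.

base = {("orbits", "planet", "sun"),
        ("attracts", "sun", "planet"),
        ("hotter_than", "sun", "planet")}
target = {("orbits", "electron", "nucleus"),
          ("attracts", "nucleus", "electron")}

# A candidate one-to-one correspondence between objects in the two systems.
mapping = {"planet": "electron", "sun": "nucleus"}

def carried_over(base_relations, target_relations, object_map):
    """Return the base relations whose mapped counterparts exist in the target."""
    matched, unmatched = [], []
    for relation, a, b in base_relations:
        counterpart = (relation, object_map.get(a, a), object_map.get(b, b))
        (matched if counterpart in target_relations else unmatched).append((relation, a, b))
    return matched, unmatched

if __name__ == "__main__":
    matched, unmatched = carried_over(base, target, mapping)
    print("relational correspondences:", matched)   # where the analogy holds
    print("gaps to investigate:", unmatched)        # where it breaks down
```

The relations that fail to carry over are not a dead end; as discussed next, such gaps are precisely where an analogy invites further inquiry.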

Importantly, when reasoning analogically, the search for correspondences is often incomplete. This does not signal the failure of the thinking process; rather, it signals the systems researcher to identify the nature of the differences, to find ways to extend the initial mapping assumptions, to isolate key distinguishing principles, and to search for additional information about what was initially less well understood—in short, to create additional knowledge. Thus, important contributions to knowledge can be made both when a researcher identifies insightful analogies and when a researcher identifies instructive disanalogies—places where an analogy breaks down (Oppenheimer, 1956).

Any mode of reasoning that helps one focus on “relational commonalities independently of the objects in which those relations are embedded” (Prieditis, 1988, p. 64) has much to offer the scholar seeking to extend general systems theory, whose central aim is to identify universal characteristics and principles governing all types of systems in all fields of study. Analogical reasoning involves discovering forms of relationship that are independent of context (Barsalou, 1982). True, not all systems researchers aim to further general systems theory. Analogical reasoning is, nonetheless, a useful cognitive tool. If used well, analogical reasoning can assist one in going beyond the surface features of a system to detect commonalities between the relational structures at work in that system and the dynamics known to operate in other systemic phenomena.

Suggestion: Two stances on context in systems research

  • Focusing on context-dependence is crucial: when you are seeking to understand the boundary characteristics of an open system, or the information/energy flows between the system and the environment in which it’s embedded.

  • Focusing on context-independence is crucial: when you are seeking to identify common ordering principles operating in seemingly different systems or isomorphic patterns across disciplines.

Let us highlight three characteristics of sound analogical reasoning. First, mapping objects between two or more systems (or parts of a single system) is good. Mapping commonalities in the ways those objects relate among themselves is better. In analogical reasoning,

the one-to-one mapping principle states that each element in a base domain can be mapped to at most one element in the corresponding target domain. The systematicity constraint requires that, all else being equal, correspondences between systems of elements in the domains are preferred to matches between isolated elements. (Dietrich, 2010, p. 335)

Second, in the psychological literature, sound analogical reasoning has a great deal to do with how one responds to the unknowns that an analogy reveals. A good analogy reveals what differing situations or systems have in common and, as importantly, points to further inferences that can be made about what their differences can teach us (Gentner & Colhoun, 2010). This is invaluable for scholarly research: “The lack of deductive certainty in analogical reasoning…means that analogy can suggest genuinely new hypotheses, whose truth could not be deduced from current knowledge” (Gentner, 2015, p. 108).

Suggestion: Consider analogical reasoning – the peril and the promise

  • Peril: Beware of enthusiastically identifying “pure matches,” object to object, and seeing your work as done.

  • Promise: Get excited about gaps you find in the partially shared structure of two different sets of data (i.e., where an analogy appears to fail because it is incomplete). This is a discovery that demands you examine those gaps, develop explanations for them, extract causal principles, identify new questions, and (hopefully!) revise your initial certainties in light of the ways you found the analogy didn’t quite work.

Third, exciting scholarly contributions to systems research can be made when analogical reasoning uncovers similar structures across “psychological distance,” in “semantically distant domains” (Dietrich, 2010; Liberman & Trope, 2008). It is not easy to detect analogical similarity between one system and another that is seemingly unrelated (in one study, only about 20% of high school students could engage in problem solving that required them to make an analogical connection to a semantically remote problem they had just solved [Gick & Holyoak, 1983]). As proponents of general systems theory know, identifying analogical similarities in systems that operate in vastly different locations, times, social settings, or hypothetical circumstances is not easy. Perhaps for exactly this reason, it can address important gaps in what is known about the world in which we live.

Competency in Engaging With the Unknown

It is worth stating plainly that writing about skillfully dealing with the unknown feels like an exercise in futility.

Writing that sentence took 8 minutes. Why would that be? If one wishes to write about the unknown, what does one write about? The unknown activates in us a sense of uncertainty, a feeling of inexactness that contrasts with the feeling of resolution we have when we believe we know something. So, to write of the unknown, we could set out categories of things a systems researcher is unlikely to know at various stages of a research project. We could articulate the kinds of things researchers can know that they do not know, and contrast them with the kinds of things researchers will not know they do not know. Considering each of these, it is difficult to avoid the distinct possibility that we have undertaken an exercise in folly. This sense—of being presented with multiple possibilities, any of which could be fruitful or useless—is how the ambiguity of the unknown feels. And, we argue, it is a felt experience that systems researchers must learn to engage competently.

The sense of being in the presence of the unknown is intrinsic to systems research. The systemic phenomena we study are usually rife with unknown interrelationships. They have a propensity for operating in spontaneous, unpredictable ways, described in colorful language such as “the zone of fruitful turbulence” (Smith, 2007) and “the edge of chaos” (Langton, 1990). As discussed in Edson and Klein’s Chapter 3 of this book, the ill-defined problems in which systemic processes are implicated are ambiguous. Ambiguity, by its very nature, involves a sense of something going “on both sides” (Partridge, 1958), a Janus-faced acknowledgment that in systems research there are always two (or more) plausible explanations for why a system is behaving as it is. We could remind the reader that the unknown is intrinsic to systems research for all of these reasons, but doing so would place the focus of attention on the external object—on the systems we study.

Instead, we shift from exterior to interior, directing the reader’s attention inward. For our experience of the unknown resides within the researcher. And handling one’s interior experience of ambiguities and unknowns is a crucial place where systems researchers can operate competently, or otherwise.

Unknowns are the raison d’être of research. We engage in research because something is not understood, or at least insufficiently understood. Our objective as researchers is to transform what is unknown into the known. Why people engage in such difficult work is more than a matter of cognitive satisfaction; engaging with the unknown is an affectively charged experience as well. Competently engaging with it is a complex cognitive-emotional skill. There is an array of skillful and less-skillful ways researchers can meet the unknown.

In organizational theory, the unknown is problematic—a threat. Seen as disruptive (Ansoff, 1975; Erlenkotter, Sethi, & Okada, 1989), surprises are viewed as unwelcome discontinuities to be avoided (King, 1995; Weick & Sutcliffe, 2001). They signify failure (Buckle, 2005). Beyond corporate life, people find uncertainty threatening (Freud, 1966), and a great deal of human behavior is oriented toward regulating the experience of anxiety it creates (Reiter-Palmon & Robinson, 2009).

There are many ways of meeting the unknown; avoidance is one of them. Humans prefer predictability; we have evolved an array of defense mechanisms (individually and collectively) for reducing our discomfort when confronted with unknown or ambiguous circumstances (Argyris, 1986; M. Lewis, 2000). Our modus operandi is to utilize what we know as guideposts to interpret the reality in which we operate. This happens in research as well. When information surfaces indicating that what we know is insufficient for understanding the phenomenon we are researching, our minds often block it out:

People are self-corrective systems. They are self-corrective against disturbance, and if the obvious is not of a kind that they can easily assimilate without internal disturbance, their self-corrective mechanisms work to sidetrack it, to hide it….Disturbing information can be framed like a pearl so that it doesn’t make a nuisance of itself; and this will be done, according to the understanding of…what would be a nuisance. (Bateson, 1972, p. 435)

In other words, one unconscious strategy researchers use to handle the unknown is to avoid seeing it (Budner, 1962).

Researchers also respond to the unknown by working to reduce it. As discussed, we analyze large problems by segmenting them into smaller pieces that are easier to handle (Bellak, 1974). We polarize what we perceive into either/or distinctions (Bouchikhi, 1998). We use data reduction strategies to condense large volumes of information into numbers and formats we find easier to handle: in qualitative research, techniques like thematic analysis, categorization, and model drawing are designed to reduce the amount of data we work with; in quantitative research, “smoothing” data points to eliminate “noise” and hierarchical cluster analysis are ways to transform much data into much less, for ease of understanding (Aldenderfer & Blashfield, 1984; Miles & Huberman, 1994; Namey, Guest, Thairu, & Johnson, 2007). The challenge of managing the unknown by reduction, regardless of the data reduction techniques legitimized in our discipline, is that reducing uncertainty with too much conviction, or too soon, can obscure our ability to perceive connections—to detect relationships in the system under study that may be operating in subtle, yet perhaps potent ways.
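
The short sketch below, built on made-up numbers of our own, illustrates the double edge of one such technique: a moving-average “smoother” tames noise, but an aggressive window can erase a brief, weak signal of exactly the kind discussed earlier under perceiving information.

```python
# Illustrative sketch (hypothetical data): moving-average smoothing reduces
# noise, but a wide window can also erase a brief, weak signal entirely.
import random

random.seed(1)
series = [random.gauss(0.0, 0.3) for _ in range(60)]
for t in range(30, 33):          # a short, weak "signal" buried in the noise
    series[t] += 1.0

def moving_average(xs, window):
    """Smooth xs with a simple trailing moving average of the given width."""
    return [sum(xs[max(0, i - window + 1): i + 1]) / min(window, i + 1)
            for i in range(len(xs))]

if __name__ == "__main__":
    light = moving_average(series, window=3)    # modest reduction
    heavy = moving_average(series, window=25)   # aggressive reduction
    print("peak near the signal, raw      :", round(max(series[28:36]), 2))
    print("peak near the signal, window=3 :", round(max(light[28:36]), 2))
    print("peak near the signal, window=25:", round(max(heavy[28:36]), 2))
```

The point is not that smoothing is wrong, but that the width of the window is a judgment call with consequences for which relationships remain visible.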

A different, equally common response to the unknown is increase. An ambiguous situation is “one which cannot be adequately structured or categorized by the individual because of the lack of sufficient cues” (Budner, 1962, p. 30). Accordingly, one way to cope competently with our sense of lack in ambiguous circumstances is to get more information. With more data, the tension of lacking data will ease in us. Of course, with too much more data, the desire to discharge tension by reducing the amount of data we are taking in will intensify. The situation is like walking a razor’s edge: between the ambiguity-driven need to gather more information and the ambiguity-driven need to have less information with which to contend. How much information a researcher must get, then, is a challenging judgment call. The easiest way to make it is to defer to tradition, citing the judgment of researchers who have gone before us. Generally, such researchers frame their prescriptions in terms of the demands of the research question and of quality standards established by our academic community. The researcher’s own personal ability to cope effectively with the felt experience of having “too much” or “too little” information is rarely considered.

Beyond academic prescriptions, researcher behaviors aimed at data reduction and increase are predicated on one’s felt sense of too—too much or too little information. This sense of too is worth examining for the effects it has on the research, and on the researcher. Implied in too is that each of us has some comfortable capacity for carrying some quantity of information at any given time in our research work. Strains to that capacity (in the form of a feeling of too much or too little) get registered by our minds as disturbance, activating our unconscious cognitive strategies for hardening that sense of disturbance into a pearl, as Bateson (1972) has prettily described. Pearl-making is a strategy that accomplishes a satisfying encapsulation of problematic discomfort in our minds; pearl-making also obscures deeper or broader inquiry into the phenomenon that is causing the disturbance. This is a critical issue for the systems researcher. Willingness and capacity to take one’s inquiry in directions of both depth and breadth are crucial when studying systemic phenomena. This is guaranteed to bump researchers up against the uncomfortable sense of having too much and too little information. When we seek to foreclose against that discomfort, we sacrifice engaging with the system we are studying in broader and deeper ways.

The matter of too, then, is a matter of systems researchers needing to engage in relationship with a large quantity of information, with a variety of qualities, in a way that is not averse to those quantities and qualities. Humor helps. So do reminders that research involves swimming in unknown, uncertain territories. The ambiguities we encounter are merely the space where our mental models and real-world phenomena meet in unclear ways: where prior scholars failed to find answers, where no one yet thought to ask the questions you are asking now, where a newly formed phenomenon does not appear to fit what we already know. Phenomena that do not reward us with easy clarity, that do not reinforce questions that have already been asked and answered—these open new lines of inquiry (McCormick & White, 2000) and are desirable (Budner, 1962). Framing phenomena that act on our perceptual apparatus as too (-much and -little) as compelling rather than disturbing is what distinguishes skilled systems researchers from others.

By virtue of the complex nature of the systems we like to study, systems researchers place themselves into inquiry spaces that demand considerable polychronicity. Polychronicity refers to one’s coping capacity in stimulation-rich, information-intense environments (Haase, Lee, & Banks, 1979). Researchers, like all people, vary in how polychronic they are, that is, how much information they can accurately perceive in a stimulus-rich environment. Scholarship on this psychological attribute is ongoing, but it seems reasonable to propose a few things:

  • If researchers are to account for the real-world complexity inherent in a system, they must develop finely tuned polychronic abilities to perceive and interpret a great number of stimuli that vary widely in quality and intensity, operate across time and space, and exert varying potency in shaping the system’s behavior.

  • Systems researchers require greater degrees of polychronicity than researchers who focus their inquiries on smaller or simpler parts of complex phenomena.

The capacity to be polychronic demands an openness to information-dense, ambiguous research settings; that is, it is antithetical to the view that high levels of stimulation and information are threatening.

In the psychological literature, this cognitive capacity is generally framed as “tolerance of ambiguity” (Frenkel-Brunswik, 1949). More than tolerating it, systems researchers must possess great competency in attuning to the ambiguities that arise as they engage in systems inquiries. For systems researchers to be highly competent in this area, they must possess: (a) the capacity to take in a great deal of information (polychronicity), and (b) the capacity to skillfully choose how to engage with that information while experiencing the tension produced by simultaneously holding divergent data, apparent incommensurabilities, puzzling paradoxes, and so forth. The urge to discharge this tension is understandable. Indeed, an important function of research is probably to reduce some of the existential anxiety inherent in the human condition by discovering order and structure in the world we share. However, an important competency for you as a systems researcher is not merely to tolerate ambiguity by resolving or eliminating it, but to become intimately familiar with how ambiguity presents itself to you and with how you allow yourself to be informed by it.

Suggestion: Attuning to your experience of ambiguity

  • Recall a point in your research when you felt bombarded with information. What was your felt experience of being in this polychronous research situation? What can you learn from that felt experience?

Some examples:

  • Confusion? This generally signals a conflict between an assumption you had, based on prior knowledge or experience, and the real-time phenomenological experience of how the information presented itself to you.

  • Annoyance, frustration? This signals the failure of your phenomenon to fit standards you had set, categories you believed in, or mental models you no doubt worked hard to adopt or develop.

The aim in attuning to our experiences of the uncertain, ambiguous unknown is to get to a state of curiosity. The equally important aim is to notice when we are not curious, and to become curious about that! From a place of curiosity, systems researchers can cultivate productive naiveté and useful confusion. With a playful attitude toward the limitations of our knowledge and abilities at any particular point in our research, we can struggle more skillfully. By experiencing at once the intensity of a desire to understand and the awareness that we do not understand how a system works, we train our minds in a keenness of focus and active attention that better enables us to detect the subtle processes of disorder and coalescence that characterize systemic phenomena. Rather than privileging certainty over confusion, systems researchers should actively cultivate ambidexterity in skillfully engaging with both.

Competency in Reflexivity

Thinking about thought is notoriously difficult. (Rajahalme, 2008, p. 43)

Research implies endurance. It involves sustained attention, “continuous experiential contact” (Seamon & Zajonc, 1998)—months or years of circumambulating a particular phenomenon in the quest for understanding. Every researcher must become familiar with the tools of the trade: the software programs, data collection procedures, and equipment that enable us to do the work we do. The most important of these tools is the person of the researcher her- or himself—the unique constellation of capacities, training, skills, and personality attributes that forms the perceptual apparatus central to the research endeavor. As systems researchers, we must become intimately acquainted with ourselves as research instruments—including how we personally think and feel about the work we do. The world of contemporary systems research is one of engagement with complex systems that entrain objects, people, activities, and processes in ways that can be far-reaching, subtle, and potent. As such, we become involved for long periods of time with phenomena that produce vast amounts of information for us to assimilate, feeling insistent demands on our attention to consider yet another perspective. We engage with phenomena that, quite likely, will entrain us in their dynamics in unanticipated ways. Systems research implies immersion.

Suggestion: Consider entrainment

  • Entrainment refers to the ways actors, activities, and objects tend to be pulled into synchronization. By virtue of working on, with, or in a system for a sustained period of time, a systems researcher will observe ways that certain dynamics influence the activity happening therein (McGrath & Rotchford, 1983). Systems researchers are susceptible to entrainment by the same logics operating in the systems they study (Ancona & Chong, 1996). Just as ideas and influences operating in a socio-technical system interpenetrate all stakeholders in varying ways, they can likewise penetrate the assumptions and actions of the systems researcher, although we often cannot predict how or when this will occur. Our attunement to a system can lead to entrainment in the very systemic dynamics we study.

As matters of quality assurance and ethics alike, any researcher should become intimately familiar with “our own personal involvement in how we usually meet the world” (Wahl, 2005, p. 62). There is something about researching the systemic nature of nature itself that seems to make deep familiarity with oneself particularly crucial. Systems research seeks to engage directly with how our universe operates in deeply relational ways, expressing coherence and wholeness quite at odds with our scientific traditions of seeing the world as comprised of atomistic parts. Perhaps because of this departure from tradition, discerning systemic wholeness during a research project often involves deeply meaningful moments when one perceives disparate parts of data coalescing in newly seen ways. Theoretical biologist and complexity scholar Brian Goodwin describes such experiences this way:

[Assembling] observations together to get a coherent whole… is generally accompanied by a sense of the elegance and beauty of the natural world that is experienced as deep aesthetic pleasure. And this is regarded as something of a touchstone for the truth…[it] involves both subjective experience and a qualitative evaluation: elegance, simplicity, beauty, truth, are the most common descriptors. The resulting theory often has the property of parsimony. (Goodwin, 1999, “From Quantities to Qualities,” para. 2)

In an atomistic universe, researchers are schooled to view such subjectivity as a threat to high-calibre research. Thus, researchers must continually check their insights against those of “a community of individuals practising procedures of research” (Goodwin, 1999, “Color Experience,” para. 5) appropriate to the phenomenon under study. This is how rigorous research is done.

However, contemporary scholarship calls us to a different relationship with our individual subjectivities. Goodwin (1999) contends:

There is no intrinsic reason why our feelings should not carry real insight into the nature of the processes we experience in nature. Just as a sense of the elegance, beauty, and truth of the scientific insight are experienced as significant indicators of the real value of an idea, that make sense of the diverse observations, joining them together in a consistent unity, so feelings associated with observations of the phenomena can be significant indicators of their real natures, giving them meaning. (“Color Experience,” para. 5)

It may well be that systemic wholeness is an instance of qualia—a quality of nature that can only be subjectively experienced. This suggests we should treat moments of meaningfulness, elegance, parsimony, and so forth in systems research as indicators of potentially important qualities of systemic wholeness, not merely as emotional reactions that cloud our inquiry. Our subjective responses to the systems we research may reveal intrinsic properties of those systems, not merely our own stuff.

The art of good research, then, is to use one’s perceptual apparatus in actively receptive ways (Bortoft, 1996) that do not dominate what we observe by our preconceived ideas. At times we must seek to take in information in as unfettered a way as possible; at other times we must seek to impose intellectual structure on what we have perceived (Seamon & Zajonc, 1998). The challenge of research is to recognize that human minds do both. We must work to develop an attunement to when one or the other is holding sway within us.

From a systems perspective, however, one’s subjectivities are not merely private idiosyncrasies. Rather, they are the ways certain systems of thought have claimed us, governing the ways we think and feel, the values we privilege and those we de-emphasize, the normative opinions we hold and the ethics we espouse, the power relations we notice and the thousands of judgment calls we make through the span of a research project. From a systems perspective, objectivities are not unquestionable truths, or “raw reality” (Boulton et al., 2015). Rather, they are products of “science, [which itself is] the expression of a culture…not a given; it implies a construction in which we take part” (Prigogine, 1996, p. 39). In this book, Edson and Klein, Sankaran, and Varey have written about recognized stages of doing good systems research. Their guidance in justifying the ways to design a study, gather data, analyze them, and make claims about research findings is grounded within the logic of the scholarly and cultural systems in which we are entrained. Your lifelong work as a systems researcher is to take none of those logics as unquestionable, and to become intimately acquainted with their merits and blind spots, so you can know what agendas your research is furthering and accept accountability for that.

Conclusion

To conduct systems research is to participate in a form of inquiry that works in a different direction from the heavily biased view that the universe is comprised of myriad parts having relatively little to do with one another.

The atomistic view of the world that separates observer from observed and objects from one another has accomplished tremendous things. More than any other idea in the history of science, it has given us unparalleled understanding of the universe in which we live. As much as systems theorists have critiqued it, we owe much to it. It is likewise true that systems research offers an important complement to traditional modes of inquiry. It allows us to engage more directly with systemic complexity rather than seeking to fragment and simplify it.

It has been said that dealing with complexity “is emotionally as well as intellectually demanding” (Boulton et al., 2015, p. 136). To train oneself to perceive systemic wholeness, with the vast intricacies of relationship that implies, guarantees certain things for the systems researcher. The work will often be confusing. It will frequently feel uncomfortable. It will seem overwhelming.

And if you are seeking to engage in a work and a life informed by the systemic nature of our world, it will be necessary. And quite meaningful.

In this chapter, we have sought to describe several perceptual competencies necessary for the systems researcher. Exciting ongoing research is seeking to identify how to measure and develop these competencies in systems researchers, and to identify other competencies demonstrated by experts in the field. It is unlikely that the competencies discussed in this chapter comprise a comprehensive list of systems research competencies, but we suggest they are fundamental. First among them is the ability to perceive the ways parts and wholeness express one another. The characteristic of systemic complexity demands competencies in perceiving order, change, relationship, and information in a system. Systems researchers must know the difference between using analogical reasoning well and falling prey to naïve analogies. Attuning to the felt experience of the unknown and having strategies for coping with it is important in systems research. Competent systems researchers are reflexive people who seek to understand their unique perceptual strengths and limitations. Together, these competencies contribute to the conduct of effective systems inquiry. In the next chapter, we examine how to evaluate the impact of the systems research we do.