Abstract
As science and technology create an ecosystem that is becoming increasingly knowledge-intensive, complex, and interconnected, the Next Generation Science Standards include systems thinking and systems modeling among the 21st-century skills that should be fostered. We examined the effect of an online cross-disciplinary learning process on the development of systems thinking and modeling skills among engineering students and engineering and science teachers. The study, which used quantitative and qualitative tools, included 55 participants who performed four food-related learning assignments and created conceptual models in Object-Process Methodology. Their responses to the online assignments were analyzed along with their perceptions, captured via a reflection questionnaire. The online learning process in this study effectively enhanced the systems thinking and modeling skills of all learners, including those with no relevant background. One main conclusion that extends beyond online learning is that imparting the basics of systems thinking and conceptual modeling can be achieved even within a short period of time—less than one semester. The contribution of the study is the formation of theoretical and practical frameworks for the integration of cross-disciplinary model-based systems engineering online assignments into engineering and science curricula.
Introduction
Since the beginning of the twenty-first century, the world has been changing much faster than ever before in human history. This accelerated change must be considered in several fields, including the economic, social, and educational ones (Comfort & Timms, 2018). In the wake of the digital era, educational systems, as an integral and influential part of modern life, must be adapted to the changing reality (Schleicher, 2019). This has become more critical since the COVID-19 pandemic erupted, forcing an overnight transition to remote teaching and learning and highlighting technology-based digital environments and tools (Daniel, 2020).
Digital educational environments seek, at least theoretically, to develop students’ competencies and skills instead of just delivering data from teachers to their students, as was quite common in the not-too-distant past (Care, 2018). Towards the very end of the twentieth century, a considerable number of initiatives started catering to the growing need for a profound change in teaching and learning processes, focusing on the shift from teacher-centered to student- and competencies-oriented education. The National Research Council framework (NRC, 2012a) for the so-called 21st-century skills strongly emphasizes systems thinking as a skill that is central to science and engineering (Comfort & Timms, 2018). Based on the NRC framework, the Next Generation Science Standards (NGSS) (NGSS Lead States, 2013) were formulated. These standards are divided into three groups: (1) scientific and engineering practices, (2) crosscutting concepts, and (3) disciplinary core ideas. One of these crosscutting concepts, i.e., concepts that have applications across all scientific domains, is “Systems and system models. […] specifying […] boundaries and making explicit a model of that system—provides tools for understanding and testing ideas that are applicable throughout science and engineering” (NGSS Lead States, 2013, p. 84). The focus of this concept is systems thinking and systems modeling, which this research sets out to explore in the context of engineering and science online education. As systems are becoming more complex and knowledge-intensive (Wade & Heydari, 2014), the demand for systems thinking is on the rise (Cabrera et al., 2021; Koral-Kordova et al., 2018; Lavi et al., 2019; Mahaffy et al., 2018). Systems thinking is the basis for modeling skills, which are necessary for representing, understanding, and planning natural and human-made complex systems, serving both science and engineering (Akiri et al., 2020; Gilissen et al., 2020; Verhoeff et al., 2008; Wilensky & Resnick, 1999).
The current study aims at examining the effect of a cross-disciplinary online learning process on the development of systems thinking and conceptual modeling skills among engineering students and engineering and science teachers.
Theoretical background
In this section, the background of this research is presented with an emphasis on theories and concepts related to systems thinking, conceptual modeling, and engineering and science teaching and learning.
Systems thinking
Systems thinking is relevant to both engineered systems and natural phenomena. In both cases, the term system can be defined as an entity that maintains itself and functions as a whole through its interacting parts. A more recent definition of a system is an arrangement of parts or elements that together exhibit behavior or meaning that the individual constituents do not (Dori et al., 2020). This definition emphasizes the element of emergence and accommodates conceptual systems along with natural and engineered ones.
An engineered system is supposed to provide value to its beneficiary by performing a function in which a process transforms (generates, consumes, or changes the state of) one or more objects. Natural phenomena, in contrast, are not meant to deliver any predetermined benefit, but they may have an outcome, whether detrimental or beneficial (Assaraf & Orion, 2005; Dori & Sillitto, 2017).
Many disparate definitions have been proposed for systems thinking over the years (Arnold & Wade, 2015, 2017; Booth-Sweeney & Sterman, 2000; Grohs et al., 2018; Monat & Gannon, 2015). Assaraf and Orion (2005) identified common points in these definitions of systems thinking: (1) developing the ability to think dynamically, i.e., to understand how a complex system changes over time through its interacting parts, (2) understanding how the interacting parts of the system affect its behavior, and consequently, its function, (3) identifying positive and negative feedback processes within the system, and (4) raising hypotheses and examining them via models. Booth-Sweeney and Sterman (2000) suggested that whatever the definition of systems thinking may be, each of the systems thinking components is based upon rudimentary high school skills and knowledge that include (1) interpreting graphical representations of the data, as well as creating such representations, (2) having basic knowledge and understanding of algebra, statistics, and units of measure, and (3) telling a story based on quantitative data, and vice versa—eliciting such data based on its textual representation. The equivalent of storytelling is materialized in OPM (Lavi & Dori, 2019; Stenger et al., 2021), discussed later, where the formal graphical representation of a system with its structure and behavior is converted automatically to a textual representation in a simple subset of natural language.
To transcend disciplinary boundaries, General Systems Theory (GST) (von Bertalanffy, 1968) was established based on the notion that systems studied in various disciplines and branches share certain common principles and features, and therefore can and should form the basis of the science of systems as a unified foundation for all sciences (Dori, 2005; Fillieule, 2001). In its pursuit of generalizations, GST moves toward a state of science known today as ‘trans-disciplinarity’ (Hofkirchner & Schafranek, 2011), in which all hierarchical levels in the system are coordinated, requiring a “different manner of seeing the world, more systemic and more holistic.” (Max-Neef, 2005, p. 15).
A type of thinking that revolves around complex entities of various kinds, systems thinking is common to all disciplines that involve human cognition. It enables people to make causal inferences about a wide range of biological, ecological, engineering, and social phenomena, regardless of their complexity or interconnectedness levels (Cabrera et al., 2008; Dori, 2005; Wilensky & Resnick, 1999).
Trying to formulate a cross-disciplinary definition, Whitehead and colleagues (Whitehead et al., 2015) suggested that systems thinking is “a thought process through which assumptions are examined about a set of interconnected elements that drive toward a common goal with the objective of discerning hidden values and evaluating evidence in order to assess conclusions” [p. 10]. Put more simply and inclusively, and based on Dori and colleagues (2020), systems thinking is defined as a thought process in which assumptions about interactions among interconnected elements of a system or a phenomenon are examined to predict behavior, outcomes, and, in the case of human-made artifacts, value to beneficiaries.
Over the past two decades, a significant number of studies have looked into the development and assessment of systems thinking in various disciplines, topics, and populations, including engineering in higher education (Frank, 2006) and high school (Gero & Zach, 2014), pre- and in-service elementary teachers in the science context (Lee et al., 2019), postsecondary science courses (Golick et al., 2018), graduate students’ skills in economy as a component of sustainability education (Summerton et al., 2019), graduate engineering and computer science students (Schraw, 2007), student teachers of biology and geography (Schuler et al., 2018), and graduate engineering and science education students (Akiri et al., 2020).
Forming a systems thinking framework in curriculum planning is critical for preparing students to tackle complex problems (Bracher, 2021; Eilam & Reisfeld, 2017; Gilissen et al., 2020; Hmelo-Silver et al., 2017; Rosenkränzer et al., 2017; Talanquer, 2019). Promoting systems thinking through cross-disciplinary curricula may be effective in engineering- and science-oriented courses (Gero & Zach, 2014) and in education in general (Lattuca et al., 2004), by following the Piagetian principles formulated in the Theory of Cognitive Development. According to Piaget, cross-disciplinary education is supposed to foster higher-order thinking skills and a deeper understanding of complex topics (Gero & Shlomo, 2021). Since most education systems curricula rarely rely upon cross-disciplinarity (Klaassen, 2018), fostering modeling and systems thinking skills in other disciplines should be explicitly promoted (Bracher, 2021). Engineers, as well as their educators, usually engage in, or are at least trained in, paying attention to details, understanding how things work, and solving problems by dismantling and reassembling (physically or metaphorically) different systems (Brophy et al., 2008; Pleasants & Olson, 2019; Winberg, 2008). Moreover, while in the field of engineering and engineering education, systems thinking has been practiced and studied extensively (Assaraf & Orion, 2005), science and engineering teachers (and teachers in general) usually lack this training or educational background (Arnold & Wade, 2017). Yoon and colleagues (2017) discussed the important role of professional development in preparing teachers to teach about complex systems successfully and in increasing teachers’ awareness of using models in science education.
In this regard, Pedagogical Content Knowledge–PCK (Shulman, 1987) describes the ability to teach systems using appropriate teaching and learning methods, and not only define them (Rosenkränzer et al., 2016, 2017; Schuler et al., 2018; Streiling et al., 2021). As part of PCK in pre-service teacher preparation, teachers should be able to represent complex systems of various kinds with various models (Lavi & Dori, 2019).
Regarding modeling, Barak (2022) characterized systems thinking in the context of learning electronics in school and examined how its characteristics are reflected in home electrical appliances. Four fundamental areas that are crucial to teaching and fostering systems thinking were identified: science and engineering view, innovation, the role of the engineer, and modeling. Utilizing conceptual modeling for the development of thinking skills and their assessment has been proven useful in other recent studies, as it helps simplify complex thinking processes (Akiri et al., 2020; Lavi & Dori, 2019) besides being a stand-alone skill in its own right (Buchmann et al., 2019). While several systems thinking frameworks explicitly mention conceptual modeling (and modeling in general) as a skill under the systems thinking skill set (e.g., Fanta et al., 2019; Riess & Mischo, 2010; Stave & Hopper, 2007), others view it as a related, yet different, skill (e.g., Assaraf & Orion, 2005; Grohs et al., 2018; Lavi et al., 2019). Verhoeff and colleagues (2018) noted that a notable difference among systems thinking frameworks is the extent, if any, to which they consider modeling in general, and conceptual modeling in particular, as an explicit part of systems thinking. To the best of our knowledge, it has not been established to date whether the two skills are separate in terms of the required abilities, and we hypothesize that each skill requires quite different abilities. Therefore, in the current study, we consider conceptual modeling as a cognitive process that is different from systems thinking, and we use complementary research tools to identify and measure each of the two skills.
Although there is more agreement on the importance of systems thinking than on its operational definition (Dugan et al., 2021; Gharajedaghi, 2011), certain operational similarities, as well as differences, can be identified across various disciplines, as presented in Table 1. While some definitions, such as systems engineering, are highly domain-specific (Frank, 2006, 2010), others are more general, as is evident by their noticeable mention in other fields and domains (e.g., Assaraf & Orion, 2005; Stave & Hopper, 2007). In this regard, Gharajedaghi (2011) argued that structure, function, and process (also referred to as ‘behavior’ by others) within their context suffice to define the whole and enable its understanding on both operational and theoretical grounds.
This study aims to measure systems thinking and conceptual modeling skills, specifically the ability to identify in a model the structure, function, and behavior of a system. We examined the application of this ability in several food-related contexts, including chocolate making and cod fish processing. The learning process used for this purpose was designed to be cross-disciplinary. In addition to research purposes, this model-based approach can be applied to teaching and learning various science and engineering subjects.
To achieve this level of generality, we formulated an operational definition of systems thinking that extends that of Gharajedaghi (2011). Drawing on assertions presented in Table 1, the operational definition of systems thinking below covers a host of disciplines and includes measurable aspects of declarative knowledge alongside procedural-practical knowledge ones.
Operationally defined, systems thinking refers to an individual’s ability to perform the following processes:
1. Acquiring a clear understanding of key system terms and concepts,
2. Predicting expected outcomes (in science) or characterizing the intended purpose of a system (in engineering),
3. Identifying the main function of the system, including the main process, the affected objects, and the process enablers, both human agents and inanimate instruments,
4. Identifying interactions of static and dynamic nature through structural and procedural links, respectively,
5. Describing the system at different, increasingly refined levels of detail, and
6. Placing the system within its context as a functioning whole and interconnecting different systems.
These processes that make up systems thinking are reflected in the learning process we developed, which is presented in detail in “Methodology”.
Conceptual modeling
To cope with soaring system complexity, a prevalent didactic method of a constructivist nature is conceptual mapping: drawing a diagram in which each node describes a concept, and each link, typically a line or an arrow, represents some relationship between the concepts (Dori, 2016; Novak & Gowin, 1984). Although conceptual mapping may reflect students’ systems thinking and improve their understanding of complex systems (Tripto et al., 2013), this method is relatively subjective, slow, and complicated to assess (Freeman & Urbaczewski, 2020).
Unlike concept mapping, conceptual modeling uses a formal graphical language and unambiguously distinguishes between different types of concepts and interrelationships to effectively organize big data and complex systems (Chen, 1976; Dori, 2002, 2016; Lavi et al., 2019). In information and systems engineering, conceptual modeling is a common method for representing and managing data derived therefrom and system specifications (Wand & Weber, 2002). Conceptual modeling involves the simultaneous expression of one’s ideas verbally and visually (Dori, 2008), a process in which system-related knowledge is represented in both science and engineering (Dori, 2016). As Mylopoulos (1992) put it, conceptual modeling is a formal process of unequivocally describing certain physical and social aspects of the world around us in an attempt to comprehend and communicate it.
Developing and using conceptual models—the main artifacts of Model-Based Systems Engineering (MBSE)—involves creating representations of what the structure of a complex system is and how it operates or behaves over time. The conceptual model enables readers to understand, analyze, design, and present complex systems or the subject they represent (Dori, 2016; Lavi & Dori, 2019; Stenger et al., 2021). Other researchers emphasized the need for ontological scaffoldings (i.e., explicitly teaching students about complex system concepts) in order to help students understand difficult aspects when learning about complex systems and modeling them (Jacobson et al., 2011; Rates et al., 2022). In the current study, we used OPM—Object-Process Methodology (Dori, 2016), elaborated in the Methodology section, as the conceptual modeling language and methodology.
In a recent study that involved 12th-grade students who majored in electronics, Gero and colleagues (2021) examined the relations between systems thinking and abstract thinking, i.e., the ability to perceptually shift between different detail levels (Kramer, 2007). The authors reported a significant correlation between the two types of thinking, concluding that these relations may have practical implications across different education levels. In this context, Dhukaram and colleagues (2018) argued that systems thinking can encourage the abstraction of new knowledge, thereby enhancing conceptual understanding. OPM facilitates complexity management, including shifting between abstraction levels, by using various abstracting-refining mechanisms and therefore can add value to the development of systems thinking (Dori, 2016).
Conceptual modelers can significantly benefit from certain skills and backgrounds, such as technical skills and well-suited, timely training (Davies et al., 2006), along with domain-specific knowledge and the ability to understand, analyze, and make predictions based on the modeling of this knowledge (Mayr & Thalheim, 2021). However, according to Kayama and colleagues (2016), conceptual modeling can and should be taught to inexperienced students with no relevant previously acquired skills or background. They did not observe consistent differences in reading and creating conceptual models between first-year and second-year computer science undergraduates.
A recent systematic literature review (Rosenthal et al., 2019) identified only 121 published papers between 1986 and 2017 that addressed learning and teaching of conceptual modeling. An order of magnitude fewer studies, 13, dealt with learning and teaching of conceptual modeling as a general theme in and of itself rather than its use in a specific aspect, such as object-oriented programming or data modeling. More theoretical, empirical, and design science research is needed to significantly advance our knowledge in various aspects of learning and teaching of conceptual modeling (Kayama et al., 2016; Rosenthal et al., 2019) not only as a tool used by information and systems engineers but as a discipline in its own right that is relevant to and instrumental for many other domains (Buchmann et al., 2019).
Asynchronous online learning
Over the past two decades, education systems around the world have been struggling to remain relevant and effective in face of tremendous global technological changes (Iivari et al., 2020). The outbreak of COVID-19 has further changed people’s lives (Aristovnik et al., 2020; OECD, 2012), including K-12 and academic education (Accreditation Board for Engineering and Technology, 2020). Remote learning has become the default in schools and higher education, forcing an almost overnight switch to online teaching and learning (Aristovnik et al., 2020; Daniel, 2020; Guo, 2020; Iivari et al., 2020), which has intensified processes that had already begun at the dawn of the new millennium (e.g., Lee et al., 2001).
Online learning can be synchronous or asynchronous. The former mandates the simultaneous presence of the learner and the teacher in the same virtual space, while in the latter the learner can choose the appropriate time and pace of learning (Parrish et al., 2021). Some studies suggest that synchronous or asynchronous online learning may enhance engagement in the learning process, whereas others indicate the opposite. According to Henrie and colleagues (2015), this ambiguity is due to the lack of uniformity around definitions regarding the nature of students’ engagement in online learning and its measurement.
A salient feature of asynchronous online learning is the opportunity it provides students to delve deeper into and reflect upon the learning process, especially when complex topics that require considerable mental resources are involved (Hrastinski, 2008; Lin & Gao, 2020; Robinson & Hullinger, 2008). Mastering systems thinking may thus benefit from being applied in such cases. The same is true for conceptual modeling of complex systems (Rosenthal et al., 2019).
The main takeaway from the above literature survey that is most relevant to the focus of this study is that systems thinking should and can be fostered among school and higher education students in all STEM domains (Assaraf & Orion, 2005; Lavi & Dori, 2019; Lavi et al., 2019; Riess & Mischo, 2010; Talanquer, 2019). This will become possible only after their teachers themselves acquire knowledge and understanding of systems thinking and effective ways of teaching it, namely the acquisition of relevant PCK (Arnold & Wade, 2017; Rosenkränzer et al., 2016, 2017; Szozda et al., 2022; Yoon et al., 2017). In addition, according to several studies (e.g., Eilam & Reisfeld, 2017; Fanta et al., 2019; Hmelo-Silver et al., 2017; Monat & Gannon, 2015; Stave & Hopper, 2007; Verhoeff et al., 2008), conceptual modeling is an integral part of systems thinking that represents higher-order systems thinking and should thus be explicitly included in the systems thinking skill set (Verhoeff et al., 2018). Where modeling is not an explicit and integral part of systems thinking (e.g., Dugan et al., 2021; Frank, 2006, 2010; Grohs et al., 2018), it is still considered as being related to the development and evaluation of systems thinking. More research in conceptual modeling is needed to establish knowledge about learner needs and barriers. Such knowledge will, in turn, better inform the design of conceptual modeling learning tools (Rosenthal et al., 2019). The finding of Kayama and colleagues (2016) reinforced the argument that success in conceptual modeling reflects a high level of systems thinking, and therefore teachers and students may require more time, effort, and experience to master it.
Research objective and questions
The overarching objective of this research was to evaluate the effect of a food-related, modeling-oriented, online cross-disciplinary learning process on the systems thinking and modeling skills of engineering students and pre- and in-service science and engineering teachers. In this study, we separated the two skills in both theoretical and assessment aspects, as explained earlier in the Systems thinking Subsection.
Four asynchronous online assignments were developed to form the learning process, and two research questions (RQs) were investigated:
RQ1
What is the effect, if any, of a cross-disciplinary online learning process on the participants’ systems thinking and modeling skill levels?
RQ2
What are the participants’ perceptions regarding the effect of the learning process on their systems thinking and modeling skills, and what are the challenges they faced while studying?
Methodology
This research was based on the convergent design of the mixed methods approach. In convergent design, researchers combine the results of the qualitative and quantitative data analyses so they can be compared and merged (Creswell & Clark, 2018). This section describes the research setting, participants, tools, and data analysis.
Research participants and setting
This research was performed as part of the EIT Food project TRACOD—Model-Based Tracking of Cod and Other Fish Value Chain for Consumer Confidence Boosting and Food Engineers Education (https://tracod.eu/). EIT Food is Europe’s leading food innovation initiative, working to make food production and distribution more sustainable, healthy, and credible (https://www.eitfood.eu/). Topics related to this cross-disciplinary subject are taught in various courses at our academic engineering university. Three of them were included in this research, as detailed in Table 3.
The study, approved by the institution’s Behavioral Sciences Research Ethics Committee (Approval #2020-165, July 16, 2020), included 40 undergraduate and 15 graduate students (N = 55), of whom 34 were females and 21 males. Based on faculty affiliation (Table 3), we divided the participants into two research groups: (1) the engineering student group, and (2) the engineering and science pre- and in-service teacher group. Given each group’s different training and educational background, we assumed that certain group-dependent differences in systems thinking and conceptual modeling performance would be found, as argued in the Systems thinking subsection. Neither group had any previous background in OPM modeling or systems thinking before performing the online assignments, during which they were exposed to OPM while getting acquainted with food production value chain issues related to cod fish supply and chocolate production processes. Each of the online cross-disciplinary assignments included a questionnaire and a quiz, encompassing a variety of assessment methods and question types.
In a recent systematic literature review, Dugan and colleagues (2021), who classified systems thinking assessments in engineering research, found a wide array of assessments in terms of type, format, and content. Eight prevalent formats were identified—concept mapping, scenario, open-ended, oral, fill-in-the-blank, multiple-choice, virtual reality, and brain activity monitoring. Half of these formats were applied in the learning process presented in this study, including concept mapping (OPM), open-ended, fill-in-the-blank, and multiple-choice. This variety, alongside the contextual element that is also prominent in the learning process presented in this study, can be beneficial in systems thinking assessments (Dugan et al., 2021; Eilam & Reisfeld, 2017; Talanquer, 2019). Table 2 presents a detailed description of Assignment 4, including sample questions, the content of the assignment, and an explanation of the difficulty level of the assignment relative to the previous ones. A similar description for the first three assignments is shown in Appendix 1.
Food-related issues are both cross-disciplinary and complex in nature (Allen & Prosperi, 2016; Jagustović et al., 2019; Spiertz, 2012), and almost everyone can relate to food as a life essential. Hence, accompanying food topics with conceptual models makes them especially suitable for developing and evaluating systems thinking and conceptual modeling.
The three courses, in which the online assignments were incorporated as part of the study (Table 3), were taught remotely during the 2020 academic year due to COVID-19 restrictions. It took the participants between 1 and 2 weeks to complete each assignment, a total of about 1 to 2 months for all four assignments that constituted the learning process.
Research tools
The research tools included the four online cross-disciplinary assignments, each of which comprised short videos, photos, text segments, and OPM models (see Table 2 and Appendix 1 for a detailed description). In OPM models, the structure and behavior of a system interact with each other to achieve the desired function, or in the case of natural phenomena—the expected outcome. Each assignment included a set of questions at a gradually increasing difficulty level. Gradual increase in difficulty level is particularly important in online learning (Gillett-Swan, 2017) and in learning about complex systems using modeling (Hung, 2008; Turnham et al., 2012). Figure 1 presents the composition of the questionnaires and quizzes in terms of the question type (open-ended, multiple choice, or fill-in-the-blank questions) and the modeling requirements in each one of them. The questionnaires contained between nine and 16 questions each, more than the corresponding quizzes, which contained between two and eight questions.
The participants completed the assignments asynchronously, at their own time and pace, and responded to a questionnaire and a quiz in each assignment. We hypothesized that the online assignments would provide a basis for assessing the participants’ systems thinking level because they deal with complex processes that integrate different disciplines and aspects, whereas the OPM diagrams created by them during the assignments would reflect their conceptual modeling level.
Object-Process Methodology (OPM)
Object-Process Methodology (OPM, ISO 19450:2015) is a formal MBSE language and methodology that is simple and bimodal (Crawley, 2001), and is the most researched MBSE modeling method (Dong et al., 2022). Its simplicity stems from using only three types of entities: objects—things that exist, processes—things that happen to and transform objects, and states—specific situations of objects during their lifecycles. OPM is expressed bimodally in both graphics and text. Each graphical representation (Object-Process Diagram—OPD) created in OPCloud, an OPM collaborative, cloud-based modeling environment (Dori et al., 2018), automatically generates a textual representation (Object-Process Language—OPL) (Dori, 2016), a subset of English or another natural language. Due to its simplicity and bimodality, OPM is most suitable for educational uses in addition to its applications in systems and information engineering (Dori, 1995; Lavi et al., 2019).
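OPM's bimodality—each diagram element automatically generating an equivalent natural-language sentence—can be illustrated with a toy sketch. The link names and sentence templates below are simplified illustrations of a few OPL sentence patterns, not the full OPL grammar defined in ISO 19450, and the chocolate-making triples are hypothetical examples, not taken from the study's assignments.

```python
# Toy illustration of OPD-to-OPL text generation for three procedural links.
# The templates approximate OPL sentence patterns in simplified form.
TEMPLATES = {
    "consumes": "{process} consumes {object}.",   # process consumes an input object
    "yields": "{process} yields {object}.",       # process creates an output object
    "handles": "{agent} handles {process}.",      # human agent enables a process
}

def to_opl(links):
    """Render (source, link, target) triples as OPL-like English sentences."""
    sentences = []
    for source, link, target in links:
        if link == "handles":
            sentences.append(TEMPLATES[link].format(agent=source, process=target))
        else:
            sentences.append(TEMPLATES[link].format(process=source, object=target))
    return sentences

# A hypothetical three-link model of chocolate making
model = [
    ("Chocolate Making", "consumes", "Cocoa Mass"),
    ("Chocolate Making", "yields", "Chocolate"),
    ("Chocolatier", "handles", "Chocolate Making"),
]
print("\n".join(to_opl(model)))
```

In OPCloud, this generation happens automatically and covers the complete, formally defined OPL syntax; the sketch only conveys the flavor of the graphics-to-text mapping that makes OPM bimodal.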
At the end of the learning process, the students were requested to answer a reflection questionnaire (Gibson et al., 2016) containing 1–5 Likert-scale and open-ended questions. The questionnaire was designed to capture the students’ perceptions regarding the challenges they encountered and how the learning process affected them. It included 10 Likert-type items, which were divided between two questions:
(1) What was the degree of difficulty you experienced during the study of this [the following] OPM module? For this question, the response options were: 1—Very Low, 2—Below Average, 3—Average, 4—Above Average, and 5—Very High.
(2) What is the extent to which learning OPM has contributed to your understanding of the following topics? For this question, the options were: 1—Not at All, 2—Slightly, 3—Average, 4—To a Considerable Extent, and 5—To a Great Extent.
Table 4 presents the tools, the data collected, and their analyses.
The engineering students and engineering and science teachers were also tasked with a final assignment that included two components: (1) OPM models to assess their advanced modeling skills, assessed by the modified version of STAR (Lavi et al., 2019) and (2) textual and visual content to assess their scientific literacy via the Scientific Understanding Rubric—SUR, presented by the authors (Akiri et al., 2020). The analysis of the data generated using this tool is not included in this article, but it will serve as a basis for further studies that will focus on teachers’ ability to engage in complex systems teaching. An example of such an assignment is presented in Fig. 2, with a score of 9/10 in both conceptual modeling and scientific literacy.
Data analysis
Each of the four online assignments was scored for systems thinking level according to a predetermined scheme for the open- and closed-ended questions. In the case of closed-ended questions, the scoring scheme was incorporated into Google Forms which the students used to respond to the assignments. Each correct answer received a score of one point, and the final score was calculated as a percentage of correct answers out of the total possible score. For example, a participant who correctly answered six questions out of eight received a score of 75. Answers to open-ended questions were reviewed manually and scored by one of the authors. A complete answer received 100%, an incomplete answer received 50%, and a missing, completely incorrect, or highly uninformative answer received no score at all. The total score of each questionnaire or quiz that included open-ended questions was then determined in the same way. For example, in Quiz 1, which included only seven open-ended questions, if one incomplete answer received 0.5 points while the other six received a full score, the total score was calculated as 6.5/7 × 100 = 93 points.
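As a minimal sketch of the scoring scheme described above (the function and parameter names are ours, not part of the study's materials), the percentage score can be computed as:

```python
def assignment_score(closed_correct, closed_total, open_credits):
    """Score an assignment as a percentage of points earned.

    closed_correct, closed_total: counts for closed-ended questions
    (one point per correct answer).
    open_credits: per-question credit for open-ended answers:
    1.0 (complete), 0.5 (incomplete), 0.0 (missing or incorrect).
    """
    points = closed_correct + sum(open_credits)
    total = closed_total + len(open_credits)
    return round(points / total * 100)

# Examples from the text:
# six of eight closed-ended questions answered correctly -> 75
print(assignment_score(6, 8, []))                 # 75
# Quiz 1: seven open-ended questions, six complete, one incomplete
print(assignment_score(0, 0, [1.0] * 6 + [0.5]))  # 93
```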
Following is an example from Quiz 1, in which two different answers from two different participants are presented, one complete and the other incomplete.
The question was: Describe in your own words the system presented in the diagram (the OPM diagram and the introduction to this quiz are presented in Appendix 3).
- A full score answer of participant #3241201: The system describes a process in which cod fish is monitored, thereby improving our knowledge of its nutritional value.
- A half-score answer of participant #3101201: Tracking cod fish and improving the knowledge.
While the first answer was comprehensive, covered all the parts of the diagram, and hence also described the system well, the second answer did not include enough relevant causal information and was quite laconic.
The OPDs created as part of the third and fourth assignments (as detailed in Fig. 1) were scored for the purpose of evaluating the conceptual modeling performance, which was measured separately. Following is a detailed description of the conceptual modeling performance analysis.
The modeling scores were determined based on a modified version of the Systems Thinking Assessment Rubric (STAR), developed by Lavi and colleagues (2019) and adapted by the authors (Akiri et al., 2020; see Appendix 2). STAR was originally designed to assess OPM models of a given engineered system or natural phenomenon as a whole. However, in the assignments included in this research, the requirements were mostly to model certain parts or aspects of the system under consideration rather than to create a complete model from scratch. Therefore, only the basic part of STAR, which pertains to the function, structure, and behavior system aspects, was used to assess systems thinking through OPDs in this research. Table 5 illustrates how we used the modified STAR to score a modeling requirement that appeared in the third assignment (as part of Quiz 3), dealing with the distribution of fresh cod fish. The rest of the modeling requirements were scored using the same rubric, with slight nuances depending on the requirements of each question (minimum number of objects or subprocesses, defining the main process, etc.). The modeling scores rated by the three experts based on the modified version of STAR achieved 84% interrater reliability (Hallgren, 2012).
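The article reports the interrater reliability figure without specifying which statistic was used; purely as an illustration, one of the simplest options surveyed by Hallgren (2012), mean pairwise percent agreement among the three raters, could be computed as follows. The scores and the function name below are hypothetical.

```python
from itertools import combinations
from statistics import mean

def mean_pairwise_agreement(ratings):
    """Mean pairwise percent agreement among raters.

    ratings: one list of scores per rater, aligned by rated model.
    Note: this is an illustrative statistic only; Hallgren (2012)
    also covers alternatives such as Cohen's kappa and the ICC.
    """
    return mean(
        sum(a == b for a, b in zip(r1, r2)) / len(r1) * 100
        for r1, r2 in combinations(ratings, 2)
    )

# Hypothetical scores by three raters for five OPM models (1-10 scale)
rater_1 = [8, 7, 9, 6, 10]
rater_2 = [8, 7, 9, 5, 10]
rater_3 = [8, 6, 9, 6, 10]
print(mean_pairwise_agreement([rater_1, rater_2, rater_3]))
```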
The first modeling requirement appeared in assignment U2 (U for ‘unit’), after participants had completed the preliminary assignment U-pre and U1, which gave them a solid basis to start modeling on their own with the help of scaffolding in the form of instructions and directions. Figures 3 and 4 present two examples of OPDs created by two different participants as part of U3.
One minor correction required in the OPD in Fig. 3 is technical: according to OPM syntax, the phrasing of any process should end in a present-tense verb, i.e., Land Delivering instead of Land Delivery. In Fig. 4, the clarity of the OPD is not optimal, with lines crossing each other. In addition, not all links are correct. For example, Land Transporting and the state fulfilled (a state is a situation an object can be in) were linked by an effect link, whereas a result link should have been used (text in bold font denotes OPM entities: objects, processes, and states).
To answer RQ2, students’ feedback was analyzed, including the Likert-scale ranking (1–5) in different aspects, and the identification of main themes in the open-ended questions, both top-down and bottom-up (Creswell & Clark, 2018).
Findings
To answer RQ1, participants’ systems thinking and modeling performance were scored for the four assignments: the first introductory assignment, U-pre, and the next three assignments, U1 to U3, which constituted the learning process itself.
Systems thinking performance
The systems thinking scores for the closed questions were determined automatically at the end of each assignment through the Google Forms output, while open-ended questions were checked and scored manually. The engineering student group’s mean score on the introductory U-pre assignment (M = 94, SD = 7.85) was significantly higher than that of the engineering and science teacher group (M = 70, SD = 14.84), t(53) = 7.56, p < 0.001. Figure 5 presents the systems thinking mean scores of the two groups for assignments U1 to U3. An independent samples t-test showed a significant difference in the mean score of each assignment in favor of the engineering student group, albeit smaller than that of U-pre.
To evaluate each group’s learning gain in systems thinking, Hake’s normalized gain (Hake, 1998) was calculated as the ratio of the actual individual gain to the maximum possible individual gain: \(\langle g\rangle = \langle(\%S_f - \%S_i)/(100 - \%S_i)\rangle\), where \(S_f\) is the final individual score and \(S_i\) is the initial individual score. The values range from 0 to 1. The angle brackets denote the average of individual gains, as opposed to the gain of averages, \((\langle\%S_f\rangle - \langle\%S_i\rangle)/(100 - \langle\%S_i\rangle)\). We preferred the former form of calculation because it allows comparing individuals and not only one average value per group.
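The distinction between the two forms of the calculation can be sketched as follows (the scores are hypothetical and the function names are ours):

```python
from statistics import mean

def normalized_gain(si, sf):
    """Hake's normalized gain for one individual (scores in percent)."""
    return (sf - si) / (100 - si)

def mean_individual_gain(initial, final):
    """<g>: the average of individual gains, the form used in the study."""
    return mean(normalized_gain(si, sf) for si, sf in zip(initial, final))

def gain_of_averages(initial, final):
    """The alternative: a single gain computed from the group mean scores."""
    return (mean(final) - mean(initial)) / (100 - mean(initial))

# Hypothetical scores for two learners; the two statistics generally differ:
initial, final = [20, 80], [40, 90]
print(mean_individual_gain(initial, final))  # 0.375 (mean of 0.25 and 0.5)
print(gain_of_averages(initial, final))      # 0.3
```

Averaging the individual gains keeps each learner's improvement visible, which is what makes the per-individual comparisons in the next paragraph possible.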
The relative improvement of the engineering students, as expressed by Hake’s normalized gain, was higher than that of the engineering and science teachers in systems thinking: 0.61 (SD = 0.39) compared to 0.43 (SD = 0.36), respectively. To test whether this difference was significant, we used an independent samples t-test to compare the normalized gains. Applying inferential statistics to normalized gain values has been done in several other educational studies (e.g., de Back et al., 2021; Dori & Belcher, 2005; Molin et al., 2021), as is also customary in the case of difference scores (i.e., unnormalized gains) in other pretest–posttest designs (Wright, 2006). The results demonstrated that the difference in normalized gain between the two groups is indeed significant, t(53) = 1.83, p < 0.05.
A one-way repeated measures analysis of variance (ANOVA) to detect changes in mean scores over different time points (i.e., the four assignments) revealed significant progress for both the systems engineering student and the engineering and science teacher groups, F(3, 81) = 18.52, p < 0.001, and F(3, 78) = 11.03, p < 0.001, respectively. Using post hoc tests with Bonferroni correction revealed a significant mean score increase for both research groups from ST1 to ST2, and from ST1 to ST3 (p < 0.001 for both pairwise comparisons for both groups).
Conceptual modeling performance
All OPDs the participants had produced were assessed based on the modified version of STAR, scored on a 1 to 10 scale. Four modeling requirements were incorporated into the learning process: the first and second as part of questionnaire 2 and quiz 2 (OPM1 and OPM2, respectively), the third as part of questionnaire 3 (OPM3), and the last one as part of quiz 3 (OPM4), which marked the end of the learning process.
The first and last modeling requirements showed a significant difference between the groups. For OPM1, the mean score was 77.12 (SD = 14.36) for the engineering student group and 64.58 (SD = 21.07) for the engineering and science teacher group, t(40.16) = 2.437, p < 0.05. For OPM4, the mean score was 83.08 (SD = 17.78) for the engineering student group and 70.13 (SD = 18.69) for the engineering and science teacher group, t(47) = 2.85, p < 0.01. The difference between the two groups in modeling performance was smaller than the difference in their systems thinking. Nevertheless, as shown above, significant mean score differences in favor of the engineering student group were found for the first modeling requirement and for the last one, which was the most advanced in terms of both OPM and the complexity of the food-related process.
A one-way repeated measures ANOVA was conducted separately for each group to detect changes in mean scores over the different time points. For the engineering student group, there was a statistically significant difference between at least two modeling scores, F(3, 63) = 9.26, p < 0.001, and post hoc tests with Bonferroni correction revealed a significant mean score increase from OPM3 to OPM4. For the engineering and science teacher group, the test was not statistically significant, although the last modeling requirement, OPM4, received a higher mean score than the three previous ones.
Lastly, the difference in the relative modeling improvement, < g >, between the engineering student group (M = 0.42, SD = 0.43) and the engineering and science teacher group (M = 0.27, SD = 0.34) was not significant in an independent samples t-test.
The results demonstrate that prior knowledge (as reflected in U-pre scores) in systems thinking and modeling contributes to the effectiveness of the learning process, as indicated by both the differences between the two groups at the beginning of the learning process and the improvement over time of each group.
Participants’ perceptions
The second research question related to the participants’ perceptions regarding the contribution of the learning process and the challenges they encountered during it. To this end, the participants were asked to respond to a reflection questionnaire. Of the 55 research participants, 52 answered the open-ended questions in the reflection questionnaire, whereas the closed-ended questions had lower response rates, ranging from 39 to 40 participants per question.
When asked “To what extent has learning OPM contributed to your understanding of modeling and systems thinking?”, 85% (N = 39) of the participants chose 5 (To a Great Extent) or 4 (To a Considerable Extent), and none chose 1 (Not at All). The least difficult aspect participants faced was ‘Practical—understanding OPL sentences’, with 78% (N = 40) ranking this aspect as 1 (very low difficulty) or 2 (low difficulty). The reflection questionnaire included two open-ended questions: (1) Describe in about five sentences what you learned in this course and how this may benefit you in the future, and (2) Describe in about five sentences the difficulties you encountered, if any.
Analyzing the responses to the first question, two main benefits were identified: (1) improving modeling skills, and (2) developing systems thinking.
As for modeling skills, those who were already familiar with modeling in general, using languages other than OPM, indicated mainly improvement, while for those without a background in modeling, the main gain was the awareness that the field existed at all: “I have learned how to model everything in the world which will benefit me in my future career.”, and [I have learned that] “every system can be modeled, and a model is kind of a common language for different departments.” (Participant #3132201; brackets added for clarification). Participant #2072111 stated that she has “learned about representing a problem and the logic behind it. […] There is no [only one] right model! (as was proven in practice). The OPM world is intriguing, as the units integrate, revealing more data and complicated examples.” As many as 60% of the respondents expressed the benefit of improved modeling skills in their reflections.
The following examples of statements demonstrate the perceived benefits related to systems thinking: “I have learned how to do systems thinking which I believe could benefit me in every field that I will choose to work in.” (Participant #3222201). Participant #3261201 stated that he gained a “Better understanding of how big real-world complex systems work, structure and behave.” Participant #2112111 wrote: “I have learned in this course a new method to organize my thinking about complex systems. From now on, when I face a new situation, I’ll try to understand the system components—physical and theoretical, processes and objects, agents and instruments, and so on.” This motif of organizing thinking about complex systems and developing awareness of the importance of systems thinking was repeated among one-quarter of the respondents.
In the responses to the second question, two main challenges were identified: technical issues and OPM-related ones. Some participants had difficulties at the beginning with the modeling environment, OPCloud. Participant #5052256 wrote: “I was supposed to build models from zero without any directions or explanations except for the video tutorial. It was difficult to understand the program at first.” She also added that “understanding how to build an OPM model was a challenge at first.” This issue was raised by 29% of the respondents.
The difficulties involved in getting acquainted with OPM as a new methodology and language, along with the development of a new way of thinking, were expressed by participants in a variety of ways. Participant #2072111 noted: “It was difficult for me to choose the right links to build the model based on a textual description. The theoretical part sounds clear but translating [it] into a model requires deep explanation and understanding, choosing the right connections, arranging, translating a problem.” Participant #3021201 wrote that his biggest difficulty was “implementing ideas and concepts of OPM via OPD designs.” This challenge was mentioned in various manners by other participants from both research groups, a total of 31% of the respondents.
Discussion
Systems thinking and modeling are recognized by leading educational standards-setting organizations as two prominent twenty-first century skills that future citizens should possess. Forming sustainable systems thinking and modeling skills is key to handling complex systems-related issues. Talanquer (2019), among others, claimed that changing habits and skills related to systems thinking is not possible through a short-term, one-time intervention, but only through a lengthy, continuous process. However, assessing the systems thinking level of various engineering and science education communities, along with evaluating the effectiveness of various educational tools in the acquisition of systems thinking basics, is achievable, as demonstrated in this study and several others (Grohs et al., 2018; Lavi & Dori, 2019; Rosenkränzer et al., 2017; Talanquer, 2019).
This research, part of the EIT Food project TRACOD—Model-Based Tracking of Cod and Other Fish Value Chain for Consumer Confidence Boosting and Food Engineers Education, aimed at examining the effect of an online learning process in the form of cross-disciplinary online assignments, on the development of systems thinking and modeling skills among engineering students and engineering and science teachers. Their perceptions regarding the contribution of the online learning process to advancing these skills, and the challenges they faced, were also examined. Analysis of participants’ responses revealed the following conclusions.
First, the engineering students’ performance in both systems thinking and modeling was better than that of their fellow engineering and science teachers, who presumably had been less exposed to courses and learning materials that could have contributed to developing their systems thinking level. Hence, this study indicates that science and engineering teachers, for the most part, have not been trained to teach systems thinking, since the threshold condition for teaching is the educators’ knowledge and understanding of the field or topic they are supposed to teach (Loewenberg Ball et al., 2008). Therefore, an online course like the one presented in our study, whether synchronous or asynchronous, is needed. Such a course should focus on developing systems thinking among teachers, and in turn, among their students.
Imparting systems thinking is a challenging process that takes time and multiple opportunities to practice, accompanied by appropriate formative assessment (Rosenkränzer et al., 2017; Talanquer, 2019). In addition, a significant part of the learning process presented in this study was dedicated to imparting a conceptual and terminological basis in systems and modeling. Therefore, it may be beneficial to add 2–3 advanced assignments (at least at the level of Assignment 4 presented in the current study) either as an additional course or as an extracurricular project. Such advanced assignments can provide crucially needed continuity to the acquired skills and render them more sustainable, with extra focus on procedural knowledge, applying what has been learned in previous assignments. The lack of basic systems-related knowledge among teachers, as suggested in this study, is in line with what has been reported in other studies, such as Szozda and colleagues (2022) in chemistry education and Rosenkränzer and colleagues (2017) in biology and geography education. Assaraf and Orion (2005) recommended introducing the basics of systems thinking as early as elementary school, so that middle school students (and even more so high school students) could reach a higher level of systems thinking. This way, they will not encounter systems-related ideas and terms for the first time only when they reach higher education (Riess & Mischo, 2010). The current research supports this vision, as it exposes the significant gaps between university students with a basic background in systems and those without, and demonstrates the higher effectiveness of the learning process for those with the relevant background.
Second, both groups demonstrated a significant improvement in their systems thinking over time. However, the relative systems thinking improvement of the systems engineering students was significantly higher than that of the science and engineering teachers. Although the teachers who participated in the study made noticeable gains considering their relatively low starting point, the improvement among the engineering students, who had background knowledge in systems and hence also a higher starting point, was still relatively greater. Skills and knowledge are intertwined to form domain-specific competencies, as well as more general and cross-disciplinary ones, such as systems thinking (NRC, 2012b). Lack of relevant prior knowledge may form a major barrier to understanding complex systems through systems thinking. Several systems thinking frameworks and researchers (e.g., Dugan et al., 2021; Fanta et al., 2019; Rosenkränzer, 2017) have further emphasized the importance of prior knowledge for the acquisition of systems thinking. The teaching of modeling as a precursor and scaffolding for systems thinking should start from at least middle school and continue through secondary education to higher education.
Third, although the difficulty level of the online assignments increased considerably from the introductory U-pre assignment to the fourth one, U3, the gaps between the two participant groups narrowed, indicating that the engineering and science teachers were capable of catching up with their engineering student peers in a relatively short period. The reference point for the improvement of the engineering and science teachers was twofold: the corresponding performance of their engineering student peers and their own performance at the beginning of the process. Thus, although this study is quasi-experimental without a non-intervention control group, we could attribute the improvement to the learning process rather than to unknown variables.
Systems thinking skills improvement has been demonstrated in a number of other studies in both secondary and higher education that included relatively short interventions–up to one semester (Lavi & Dori, 2019; Eilam & Reisfeld, 2017; Hmelo-Silver et al., 2017; Riess & Mischo, 2010; Rosenkränzer, 2017). Yet, our study differs in that two populations with quite different academic backgrounds were studied, thus allowing insights that would not be possible with a single homogeneous research group.
The learning process presented in this study can be adapted to a variety of other content areas with relative ease, mainly due to its online asynchronous delivery and its basic structure (i.e., OPM modeling of complex systems). The course is not content-dependent but rather conceptual modeling and systems thinking skills oriented. The learning process we developed features high modularity and flexibility which are important for ICT-based learning (Lee et al., 2001). Based on the qualitative and quantitative findings of the current study, as well as those of a previous one (Akiri et al., 2020), the course effectively elevates the learners’ conceptual modeling and systems thinking skill levels.
Finally, the score differences between the two groups were less prominent in modeling than in systems thinking. This may indicate that one skill is more complex for a particular population and less for another population depending on the background and skills to which they were exposed before starting the learning process. Alternatively, it may imply that conceptual modeling is a more complex capability that involves higher-order systems thinking. Therefore, the two research groups, which did not include systems thinking professionals, may be closer in their modeling performance despite the different academic backgrounds of each group.
The finding of Kayama and colleagues (2016) that relevant prior knowledge did not show a consistent trend in participants’ performance on conceptual modeling does not contradict our findings. The former study compared first- and second-year students of the same subject, while in our study the comparison was between two groups with different backgrounds.
Do systems thinking and conceptual modeling require different abilities?
In the current study, we separated the systems thinking and conceptual modeling skills in part because of the use of different platforms for the two skills: systems thinking was assessed by analyzing answers to questions on Google Forms, while conceptual modeling was assessed by analyzing OPM diagrams that participants had created using the OPM web-based modeling platform OPCloud. As discussed in the Theoretical background section, several systems thinking-related studies and frameworks explicitly include modeling skills (Verhoeff et al., 2018) and conceptual modeling in particular as an integral part of the systems thinking skill set (e.g., Arnold & Wade, 2015, 2017; Monat & Gannon, 2015; Rosenkränzer et al., 2017). In such frameworks, modeling activities are positioned high in the skills hierarchy. This is in line with our hypothesis that conceptual modeling is a high-level form of systems thinking, which requires mastery of abilities beyond those needed to achieve proficiency in basic systems thinking. Among the key skills of systems thinking, Verhoeff and colleagues (2018) emphasized the importance of conceptual modeling practices to promote a deep conceptual understanding of complex phenomena, arguing that conceptual modeling should be regarded as a component of higher order systems thinking. This claim is supported by our study and others (Eilam & Reisfeld, 2017; Hmelo-Silver et al., 2017; Verhoeff et al., 2008).
Arnold and Wade (2017) considered systems thinking as encompassing two facets that may operate in parallel or series—gaining systemic insight and using systemic insight. This perspective may be appropriate in describing the difference between the systems thinking component and the conceptual modeling one in this study, as the latter seems to reflect a deeper and more comprehensive understanding of the system, while the former describes the acquisition of a more basic understanding of the system’s features, one that is achieved even before acquiring a specific understanding of all the details of how the system operates (Wade & Heydari, 2014).
Our findings regarding conceptual modeling coincide with those of Rosenthal and colleagues (2019). They suggested that the next step in conceptual modeling teaching and learning is to thoroughly investigate the difficulties and barriers learners of conceptual modeling face. The outcome will inform future designs of conceptual modeling learning tools. This investigation might prove crucial to the advancement of sustainable systems thinking, accepting that conceptual modeling is indeed a manifestation of higher order systems thinking (Rosenthal et al., 2019).
Perceived challenges and contributions
The second research question probed the participants’ perceptions regarding the contribution of the online learning process to their systems thinking and modeling skills and the challenges they faced while performing the online cross-disciplinary assignments. Analyzing the written responses to the online open-ended questionnaire administered at the end of the course, two main benefits emerged. One was the participants’ realization that they had become acquainted with the concept of systems thinking and that their systems thinking developed along the course. The other was that for some of the participants the notion of conceptual modeling was new, and the modeling skills of all the participants, both those already aware of conceptual modeling and those for whom it was new, improved significantly. Challenges raised by the participants were of two kinds: methodological and technical. The methodological challenges centered on internalizing the idea underlying OPM, namely, the ability to tell objects from processes and to link both in a concise, expressive conceptual model that elicits ideas clearly and unambiguously. The technical problems related to the use of the OPCloud software, which, despite being user-friendly, requires proficiency to operate productively.
Limitations and further research
One limitation of the research is that the improvement in systems thinking and modeling skills was explored during a single semester. Long-term effects require more comprehensive, longitudinal research. This is especially true for thinking skills, which require the adoption of a new mindset as a habit and not just as a capability (Cottrell, 2017). Another limitation is that the study was conducted only among students from our academic institute, a top-tier research university. Further studies that include more diverse and broad populations will enrich the body of knowledge on the research subjects and may refine or reinforce the current findings. To the best of our knowledge, at least for the last 20 years, Assaraf and Orion (2010) are the only ones to have presented a prospective longitudinal study dealing with systems thinking. They found that students remembered the learned systems-related material six years later, but in their case, it was not a short intervention as in our study, but a year-long learning process.
Since using OPM for conceptual modeling in the context of systems thinking in engineering and science education started to be explored methodically only in recent years (Lavi & Dori, 2019; Lavi et al., 2019), more background research remains to be done. For example, the exact nature of the relationship between systems thinking and conceptual modeling requires additional clarification: Does conceptual modeling necessarily require systems thinking, or vice versa? Is the relationship bidirectional? The exact nature of the relationship between these two skills should be examined in a wider follow-up study. The relative improvement of both groups in systems thinking may imply that regardless of prior knowledge in systems thinking or modeling, OPM-based online assignments of the type presented in this research are effective in enhancing systems thinking. The effect of those assignments on conceptual modeling abilities mandates more research to gain insights that are more decisive and generalizable. One possible research direction is the use of keystroke data in computerized conceptual modeling instruction. While scoring focuses on the final product, analyzing keystroke data might enable tracing the process of constructing the product—the model. This approach may lead to a deeper understanding of the acquisition of conceptual modeling skills in general and of OPM modeling in particular, such as whether modelers focus on processes first and then on related objects.
Regarding the practical implications for science and engineering teachers, one step forward would be to integrate aspects of PCK (Shulman, 1987) into a course similar to the one presented in this study, training pre- and in-service teachers not only to understand systems but also to improve their PCK for teaching systems thinking in school. Therefore, one future research direction is to examine the use of an OPM-based learning process in the development of PCK, and not only content knowledge (CK) or pedagogical knowledge (PK), among science and engineering teachers. After longer and more comprehensive professional development, we hope and assume that teachers who understood systems but were unable to engage in systems teaching will be able to promote systems thinking among their students. In further research, we might be able to link PCK to systems thinking and conceptual modeling, which are needed in different fields and with different methods. Since this study was conducted, three additional OPM modeling-based assignments have been developed and validated based on the four assignments presented here, and they have been administered to over 150 engineering students. The topics of these new assignments are sustainable olive oil production, sustainable wine production, and the COVID-19 pandemic. We plan to use these assignments along with the first four in future research projects.
Contribution
This research has both theoretical and practical contributions. The theoretical contribution is twofold. One is the insight into the complex relationships between systems thinking and conceptual modeling. The other is the evidence that four online cross-disciplinary assignments that integrate a variety of food-related topics based on OPM can successfully facilitate the development of learners’ systems thinking and modeling skills. Since these are two essential skills that twenty-first century citizens must be proficient in, the theoretical contribution translates into a practical one. The assignments that make up the online experience, after their validation and expansion during this study, can serve engineering and science educators in teaching and assessing their students. The systems thinking- and modeling-oriented online assignments that have been conceived, developed, and evaluated through this study can be used in various contexts and situations, as a stand-alone learning process or as part of online or frontal, face-to-face courses. Regardless of the delivery mode, this cross-disciplinary online experience may serve as a prime example of how a cross-disciplinary topic—food, in this case—can be presented in the framework of the systems approach to develop students’ systems thinking and modeling capabilities. Using this template as a prototype, any topic other than food, such as transportation, medicine, or industry, can be presented in the context of the systems approach and contribute to the development of learners’ systems thinking and modeling.
Since systems thinking and systems modeling skills have been recognized worldwide by key educational organizations as critical for K-12 and higher education students (Accreditation Board for Engineering and Technology, 2020; NGSS Lead States, 2013), teachers must master them as a precondition for their assimilation among students (Fanta et al., 2019; Hmelo-Silver et al., 2017; Rosenkränzer et al., 2017; Yoon et al., 2017). Through this research, we learned that science and engineering pre- and in-service teachers had not been trained in systems thinking or conceptual modeling. Therefore, an online course such as this one, be it synchronous or asynchronous, is required to enable teachers to develop systems thinking among their students. Professional development and pre-service training programs based on the learning process presented in this work may become key promoters of such assimilation (Yoon et al., 2017). The study is significant to the science and engineering education community because systems thinking and modeling constitute key skills whose importance for all citizens—teachers, K-12 students, and college graduates—continues to grow. The fact that asynchronous online learning is gaining popularity and has accelerated drastically since COVID-19 erupted (Daniel, 2020) further increases the significance of this study.
The content knowledge covered by the online assignments used in this research was not typically within the engineering students’ area of expertise, yet their achievements were better than those of the engineering and science teachers. This may indicate that the learning process was successfully designed to develop and evaluate the learners’ systems thinking and conceptual modeling skills rather than their knowledge of food-related processes, which was nevertheless a welcome by-product.
Data Availability
Data are available on request due to privacy or other restrictions.
References
Accreditation Board for Engineering and Technology. (2020). Criteria for Accrediting Engineering Programs, 2020—2021. Retrieved January 26, 2022, from https://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2020-2021/
Akiri, E., Tal, M., Peretz, R., Dori, D., & Dori, Y. J. (2020). STEM graduate students’ systems thinking, modeling and scientific understanding—the case of food production. Applied Sciences (Switzerland), 10(21), 1–23. https://doi.org/10.3390/app10217417
Allen, T., & Prosperi, P. (2016). Modeling sustainable food systems. Environmental Management, 57(5), 956–975. https://doi.org/10.1007/s00267-016-0664-8
Aristovnik, A., Keržič, D., Ravšelj, D., Tomaževič, N., & Umek, L. (2020). Impacts of the COVID-19 pandemic on life of higher education students: A global perspective. Sustainability, 12(20), 1–34. https://doi.org/10.3390/su12208438
Arnold, R. D., & Wade, J. P. (2015). A definition of systems thinking: A systems approach. Procedia Computer Science, 44, 669–678. https://doi.org/10.1016/j.procs.2015.03.050
Arnold, R. D., & Wade, J. P. (2017). A complete set of systems thinking skills. INCOSE International Symposium, 27(1), 1355–1370. https://doi.org/10.1002/j.2334-5837.2017.00433.x
Assaraf, O. B. Z., & Orion, N. (2005). Development of system thinking skills in the context of earth system education. Journal of Research in Science Teaching, 42(5), 518–560. https://doi.org/10.1002/tea.20061
Barak, M. (2022). Fostering systems thinking in the context of electronics studies. In J. Hallström & P. J. Williams (Eds.), Teaching and learning about technological systems: Philosophical, curriculum and classroom perspectives (pp. 73–107). Springer Nature.
Ben-Zvi-Assaraf, O., & Orion, N. (2010). Four case studies, six years later: Developing system thinking skills in junior high school and sustaining them over time. Journal of Research in Science Teaching, 47(10), 1253–1280. https://doi.org/10.1002/tea.20383
Booth-Sweeney, L., & Sterman, J. D. (2000). Bathtub dynamics: Initial results of a systems thinking inventory. System Dynamics Review, 16(4), 249–286. https://doi.org/10.1002/sdr.198
Bracher, M. (2021). Foundations of a wisdom-cultivating pedagogy: Developing systems thinking across the university disciplines. Philosophies. https://doi.org/10.3390/philosophies6030073
Brophy, S., Klein, S., Portsmore, M., & Rogers, C. (2008). Advancing engineering education in P-12 classrooms. Journal of Engineering Education, 97(3), 369–387. https://doi.org/10.1002/j.2168-9830.2008.tb00985.x
Buchmann, R. A., Ghiran, A.-M., Döller, V., & Karagiannis, D. (2019). Conceptual modeling education as a “design problem.” Complex Systems Informatics and Modeling Quarterly, 21, 21–33. https://doi.org/10.7250/csimq.2019-21.02
Cabrera, L., Sokolow, J., & Cabrera, D. (2021). Developing and validating a measurement of systems thinking: The Systems Thinking and Metacognitive Inventory (STMI). Routledge Handbook of Systems Thinking, 1–42. https://www.researchgate.net/publication/349850449
Cabrera, D., Colosi, L., & Lobdell, C. (2008). Systems thinking. Evaluation and Program Planning, 31(3), 299–310. https://doi.org/10.1016/j.evalprogplan.2007.12.001
Care, E. (2018). Twenty-first century skills: From theory to action. In E. Care, P. Griffin, & M. Wilson (Eds.), Assessment and teaching of 21st century skills: Research and applications (pp. 3–17). Springer.
Chen, P. P. S. (1976). The entity-relationship model—toward a unified view of data. ACM Transactions on Database Systems (TODS), 1(1), 9–36. https://doi.org/10.1145/320434.320440
Comfort, K. B., & Timms, M. (2018). A twenty-first century skills lens on the common core state standards and the next generation science standards. In E. Care, P. Griffin, & M. Wilson (Eds.), Assessment and teaching of 21st century skills: Research and applications (pp. 131–144). Springer.
Cottrell, S. (2017). Critical thinking skills: Effective analysis, argument and reflection. Macmillan International Higher Education.
Crawley, E. (2001). Foreword. In D. Dori (Ed.), Object-process methodology: a holistic systems paradigm. Springer.
Creswell, J. W., & Clark, V. L. P. (2018). Designing and conducting mixed methods research (3rd ed.). Sage Publications.
Daniel, S. J. (2020). Education and the COVID-19 pandemic. Prospects, 49(1–2), 91–96. https://doi.org/10.1007/s11125-020-09464-3
Davies, I., Green, P., Rosemann, M., Indulska, M., & Gallo, S. (2006). How do practitioners use conceptual modeling in practice? Data and Knowledge Engineering, 58(3), 358–380. https://doi.org/10.1016/j.datak.2005.07.007
de Back, T. T., Tinga, A. M., & Louwerse, M. M. (2021). CAVE-based immersive learning in undergraduate courses: examining the effect of group size and time of application. International Journal of Educational Technology in Higher Education. https://doi.org/10.1186/s41239-021-00288-5
Dhukaram, A. V., Sgouropoulou, C., Feldman, G., & Amini, A. (2018). Higher education provision using systems thinking approach—case studies. European Journal of Engineering Education, 43(1), 3–25. https://doi.org/10.1080/03043797.2016.1210569
Dong, M., Lu, J., Wang, G., Zheng, X., & Kiritsis, D. (2022). Model-based systems engineering papers analysis based on word cloud visualization. International Systems Conference. https://doi.org/10.1109/SysCon53536.2022.9773795
Dori, D. (1995). Object-process analysis: Maintaining the balance between system structure and behaviour. Journal of Logic and Computation, 5(2), 227–249. https://doi.org/10.1093/logcom/5.2.227
Dori, D. (2002). Object-process methodology: A holistic systems paradigm (1st ed.). Springer-Verlag.
Dori, D. (2005). Modeling knowledge with object-process methodology. In D. Swartz (Ed.), Encyclopedia of knowledge management. Idea Group.
Dori, D. (2008). Words from pictures for dual-channel processing. Communications of the ACM, 51(5), 47–52.
Dori, D. (2016). Model-based systems engineering with OPM and SysML. Springer. https://doi.org/10.1007/978-1-4939-3295-5
Dori, D., Jbara, A., Levi, N., & Wengrowicz, N. (2018). Object-process methodology, OPM ISO 19450—OPCloud and the evolution of OPM modeling tools. Systems Engineering Newsletter (PPI SyEN), 61, 6–17.
Dori, D., & Sillitto, H. (2017). What is a system? An Ontological Framework. Systems Engineering, 20(3), 207–219. https://doi.org/10.1002/sys.21383
Dori, D., Sillitto, H., Griego, R. M., McKinney, D., Arnold, E. P., Godfrey, P., Martin, J., Jackson, S., & Krob, D. (2020). System definition, system worldviews, and systemness characteristics. IEEE Systems Journal, 14(2), 1538–1548. https://doi.org/10.1109/JSYST.2019.2904116
Dori, Y. J., & Belcher, J. (2005). How does technology-enabled active learning affect undergraduate students’ understanding of electromagnetism concepts? Journal of the Learning Sciences, 14(2), 243–279. https://doi.org/10.1207/s15327809jls1402_3
Dugan, K. E., Mosyjowski, E. A., Daly, S. R., & Lattuca, L. R. (2021). Systems thinking assessments in engineering: A systematic literature review. Systems Research and Behavioral Science, 39(4), 840–866. https://doi.org/10.1002/sres.2808
Eilam, B., & Reisfeld, D. (2017). A curriculum unit for promoting complex system thinking: The case of combined system dynamics and agent based models for population growth. Journal of Advances in Education Research, 2(2), 39–60. https://doi.org/10.22606/jaer.2017.22001
Fanta, D., Braeutigam, J., & Riess, W. (2019). Fostering systems thinking in student teachers of biology and geography-an intervention study. Journal of Biological Education, 54(3), 226–244. https://doi.org/10.1080/00219266.2019.1569083
Fillieule, R. (2001). The general system theory. International encyclopedia of the social & behavioral sciences (1st ed., pp. 15418–15423). Elsevier.
Frank, M. (2006). Knowledge, abilities, cognitive characteristics and behavioral competences of engineers with high capacity for engineering systems thinking (CEST). Systems Engineering, 9(2), 91–103. https://doi.org/10.1002/sys.20048
Frank, M. (2010). Assessing the interest for systems engineering positions and other engineering positions' required capacity for engineering systems thinking (CEST). Systems Engineering, 13(2), 161–174. https://doi.org/10.1002/sys.20140
Freeman, L. A., & Urbaczewski, A. (2020). Using concept maps to assess students’ understanding of information systems. Journal of Information Systems Education, 12(1), 1.
Gero, A., Shekh-Abed, A., & Hazzan, O. (2021). Interrelations between systems thinking and abstract thinking: The case of high-school electronics students. European Journal of Engineering Education, 46(5), 735–749. https://doi.org/10.1080/03043797.2020.1854186
Gero, A., & Shlomo, I. (2021). Promoting systems thinking in two-year technology students: An interdisciplinary course on medical ultrasound systems. International Journal of Engineering Education, 37(2), 564–572.
Gero, A., & Zach, E. (2014). High school programme in electro-optics: A case study on interdisciplinary learning and systems thinking. International Journal of Engineering Education, 30(5), 1190–1199.
Gharajedaghi, J. (2011). Systems thinking: Managing chaos and complexity: A platform for designing business architecture (2nd ed.). Elsevier.
Gibson, A., Kitto, K., & Bruza, P. (2016). Towards the discovery of learner metacognition from reflective writing. Journal of Learning Analytics, 3(2), 22–36. https://doi.org/10.18608/jla.2016.32.3
Gilissen, M. G. R., Knippels, M. C. P. J., Verhoeff, R. P., & van Joolingen, W. R. (2020). Teachers’ and educators’ perspectives on systems thinking and its implementation in Dutch biology education. Journal of Biological Education, 54(5), 485–496. https://doi.org/10.1080/00219266.2019.1609564
Gillett-Swan, J. (2017). The challenges of online learning: Supporting and engaging the isolated learner. Journal of Learning Design, 10(1), 20. https://doi.org/10.5204/jld.v9i3.293
Golick, D., Dauer, J., Lynch, L., & Ingram, E. (2018). A framework for pollination systems thinking and conservation. Environmental Education Research, 24(8), 1143–1158. https://doi.org/10.1080/13504622.2017.1349878
Grohs, J. R., Kirk, G. R., Soledad, M. M., & Knight, D. B. (2018). Assessing systems thinking: A tool to measure complex reasoning through ill-structured problems. Thinking Skills and Creativity, 28, 110–130. https://doi.org/10.1016/j.tsc.2018.03.003
Guo, S. (2020). Synchronous versus asynchronous online teaching of physics during the COVID-19 pandemic. Physics Education. https://doi.org/10.1088/1361-6552/aba1c5
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74. https://doi.org/10.1119/1.18809
Hallgren, K. A. (2012). Computing inter-rater reliability for observational data: An overview and tutorial. Tutorials in Quantitative Methods for Psychology, 8(1), 23. https://doi.org/10.20982/tqmp.08.1.p023
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.compedu.2015.09.005
Hmelo-Silver, C. E., Jordan, R., Eberbach, C., & Sinha, S. (2017). Systems learning with a conceptual representation: A quasi-experimental study. Instructional Science, 45(1), 53–72. https://doi.org/10.1007/s11251-016-9392-y
Hofkirchner, W., & Schafranek, M. (2011). General system theory. Philosophy of complex systems (pp. 177–194). Elsevier.
Hrastinski, S. (2008). Asynchronous and synchronous e-learning. Educause Quarterly, 31(4), 51–55.
Hung, W. (2008). Enhancing systems-thinking skills with modelling. British Journal of Educational Technology, 39(6), 1099–1120. https://doi.org/10.1111/j.1467-8535.2007.00791.x
Iivari, N., Sharma, S., & Ventä-Olkkonen, L. (2020). Digital transformation of everyday life—How COVID-19 pandemic transformed the basic education of the young generation and why information management research should care? International Journal of Information Management, 55, 102183. https://doi.org/10.1016/j.ijinfomgt.2020.102183
Jacobson, M. J., Kapur, M., So, H. J., & Lee, J. (2011). The ontologies of complexity and learning about complex systems. Instructional Science, 39(5), 763–783. https://doi.org/10.1007/s11251-010-9147-0
Jagustović, R., Zougmoré, R. B., Kessler, A., Ritsema, C. J., Keesstra, S., & Reynolds, M. (2019). Contribution of systems thinking and complex adaptive system attributes to sustainable food production: Example from a climate-smart village. Agricultural Systems, 171, 65–75. https://doi.org/10.1016/j.agsy.2018.12.008
Kayama, M., Ogata, S., Asano, D. K., & Hashimoto, M. (2016). Educational criteria for evaluating simple class diagrams made by novices for conceptual modeling. Proceedings of the 13th International Conference on Cognition and Exploratory Learning in the Digital Age, CELDA 2016, Celda, 319–323
Klaassen, R. G. (2018). Interdisciplinary education: A case study. European Journal of Engineering Education, 43(6), 842–859. https://doi.org/10.1080/03043797.2018.1442417
Koral-Kordova, S., Frank, M., & Nissel Miller, A. (2018). Systems thinking education—Seeing the forest through the trees. Systems, 6(3), 29. https://doi.org/10.3390/systems6030029
Kramer, J. (2007). Is abstraction the key to computing? Communications of the ACM, 50(4), 37–42.
Lattuca, L. R., Voight, L. J., & Fath, K. Q. (2004). Does interdisciplinarity promote learning? Theoretical support and researchable questions. Review of Higher Education, 28(1), 23–48. https://doi.org/10.1353/rhe.2004.0028
Lavi, R., & Dori, Y. J. (2019). Systems thinking of pre- and in-service science and engineering teachers. International Journal of Science Education, 41(2), 248–279. https://doi.org/10.1080/09500693.2018.1548788
Lavi, R., Dori, Y. J., Wengrowicz, N., & Dori, D. (2019). Model-based systems thinking: Assessing engineering student teams. IEEE Transactions on Education, 63(1), 39–47. https://doi.org/10.1109/TE.2019.2948807
Lee, J., Hong, N. L., & Ling, N. L. (2001). An analysis of students’ preparation for the virtual learning environment. Internet and Higher Education, 4(3–4), 231–242. https://doi.org/10.1016/S1096-7516(01)00063-X
Lee, T. D., Gail Jones, M., & Chesnutt, K. (2019). Teaching systems thinking in the context of the water cycle. Research in Science Education, 49(1), 137–172. https://doi.org/10.1007/s11165-017-9613-7
Lin, X., & Gao, L. (2020). Students’ sense of community and perspectives of taking synchronous and asynchronous online courses. Asian Journal of Distance Education, 15(1), 2020.
Loewenberg Ball, D., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching: What makes it special? Journal of Teacher Education, 59(5), 389–407. https://doi.org/10.1177/0022487108324554
Mahaffy, P. G., Brush, E. J., Haack, J. A., & Ho, F. M. (2018). Journal of chemical education call for papers—Special issue on reimagining chemistry education: Systems thinking, and green and sustainable chemistry. Journal of Chemical Education, 95(10), 1689–1691. https://doi.org/10.1021/acs.jchemed.8b00764
Max-Neef, M. A. (2005). Foundations of transdisciplinarity. Ecological Economics, 53(1), 5–16. https://doi.org/10.1016/j.ecolecon.2005.01.014
Mayr, H. C., & Thalheim, B. (2021). The triptych of conceptual modeling: A framework for a better understanding of conceptual modeling. Software and Systems Modeling, 20(1), 7–24. https://doi.org/10.1007/s10270-020-00836-z
Molin, F., Haelermans, C., Cabus, S., & Groot, W. (2021). Do feedback strategies improve students’ learning gain?-Results of a randomized experiment using polling technology in physics classrooms. Computers and Education, 175, 104339. https://doi.org/10.1016/j.compedu.2021.104339
Monat, J. P., & Gannon, T. F. (2015). What is systems thinking? A review of selected literature plus recommendations. American Journal of Systems Science, 4(1), 11–26. https://doi.org/10.5923/j.ajss.20150401.02
Mylopoulos, J. (1992). Conceptual modeling and telos. In P. Loucopoulos & R. Zicari (Eds.), Conceptual modeling, databases, and case: An integrated view of information systems development, chap 2 (pp. 49–68). Wiley.
National Research Council (NRC). (2012a). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. The National Academies Press.
National Research Council (NRC). (2012b). Education for life and work: Developing transferable knowledge and skills in the 21st century. The National Academies Press.
NGSS Lead States. (2013). Next generation science standards: For states, by states. National Research Council. https://doi.org/10.17226/18290
Novak, J. D., & Gowin, D. B. (1984). Learning how to learn. Cambridge University.
OECD. (2012). Connected minds technology and today’s learners. Educational research and innovation. OECD Publishing.
Parrish, C. W., Williams, D. S., & Estis, J. M. (2021). Leveraging synchronous engagement and asynchronous flexibility within an integrated online model for team-based learning. Journal of Educators Online, 18(2), 1.
Pleasants, J., & Olson, J. K. (2019). What is engineering? Elaborating the nature of engineering for K-12 education. Science Education, 103(1), 145–166. https://doi.org/10.1002/sce.21483
Rates, C. A., Mulvey, B. K., Chiu, J. L., & Stenger, K. (2022). Examining ontological and self-monitoring scaffolding to improve complex systems thinking with a participatory simulation. Instructional Science, 50(2), 199–221. https://doi.org/10.1007/s11251-021-09573-2
Riess, W., & Mischo, C. (2010). Promoting systems thinking through biology lessons. International Journal of Science Education, 32(6), 705–725. https://doi.org/10.1080/09500690902769946
Robinson, C. C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in online learning. Journal of Education for Business, 84(2), 101–109. https://doi.org/10.3200/JOEB.84.2.101-109
Rosenkränzer, F., Hörsch, C., Schuler, S., & Riess, W. (2017). Student teachers’ pedagogical content knowledge for teaching systems thinking: Effects of different interventions. International Journal of Science Education, 39(14), 1932–1951. https://doi.org/10.1080/09500693.2017.136260
Rosenkränzer, F., Kramer, T., Hörsch, C., Schuler, S., & Rieß, W. (2016). Promoting student teachers’ content related knowledge in teaching systems thinking: Measuring effects of an intervention through evaluating a videotaped lesson. Higher Education Studies, 6(4), 156–169.
Rosenthal, K., Ternes, B., & Strecker, S. (2019). Learning conceptual modeling: Structuring overview, research themes and paths for future research. ECIS.
Schleicher, A. (2019). PISA 2018: Insights and interpretations. OECD Publishing.
Schraw, G. (2007). The use of computer-based environments for understanding and improving self-regulation. Metacognition and Learning, 2(2–3), 169–176. https://doi.org/10.1007/s11409-007-9015-8
Schuler, S., Fanta, D., Rosenkraenzer, F., & Riess, W. (2018). Systems thinking within the scope of education for sustainable development (ESD)–a heuristic competence model as a basis for (science) teacher education. Journal of Geography in Higher Education, 42(2), 192–204. https://doi.org/10.1080/03098265.2017.1339264
Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–23.
Spiertz, H. (2012). Avenues to meet food security. The role of agronomy on solving complexity in food production and resource use. European Journal of Agronomy, 43(1), 1–8. https://doi.org/10.1016/j.eja.2012.04.004
Stave, K., & Hopper, M. (2007). What constitutes systems thinking? A proposed taxonomy. In 25th International Conference of the System Dynamics Society. http://www.systemdynamics.org/conferences/2007/proceed/index.htm
Stenger, K., Chiu, J., & Fick, S. (2021). “Adding stuff from other people”: How peer comparison influences conceptual modeling in precollege engineering contexts. In 2021 ASEE Virtual Annual Conference Content Access Proceedings. https://www.researchgate.net/publication/352131934
Streiling, S., Hörsch, C., & Rieß, W. (2021). Effects of teacher training in systems thinking on biology students—An intervention study. Sustainability, 13(14), 7631. https://doi.org/10.3390/su13147631
Summerton, L., Clark, J. H., Hurst, G. A., Ball, P. D., Rylott, E. L., Carslaw, N., Creasey, J., Murray, J., Whitford, J., Dobson, B., Sneddon, H. F., Ross, J., Metcalf, P., & McElroy, C. R. (2019). Industry-informed workshops to develop graduate skill sets in the circular economy using systems thinking. Journal of Chemical Education, 96(12), 2959. https://doi.org/10.1021/acs.jchemed.9b00257
Szozda, A. R., Bruyere, K., Lee, H., Mahaffy, P. G., & Flynn, A. B. (2022). Investigating educators’ perspectives toward systems thinking in chemistry education from international contexts. Journal of Chemical Education. https://doi.org/10.1021/acs.jchemed.2c00138
Talanquer, V. (2019). Some insights into assessing chemical systems thinking. Journal of Chemical Education, 96(12), 2918–2925. https://doi.org/10.1021/acs.jchemed.9b00218
Tripto, J., Assaraf, O. B. Z., & Amit, M. (2013). Mapping what they know: Concept maps as an effective tool for assessing students’ systems thinking. American Journal of Operations Research, 3(1), 245–258. https://doi.org/10.4236/ajor.2013.31A022
Turnham, E. J. A., Braun, D. A., & Wolpert, D. M. (2012). Facilitation of learning induced by both random and gradual visuomotor task variation. Journal of Neurophysiology, 107(4), 1111–1122. https://doi.org/10.1152/jn.00635.2011
Verhoeff, R. P., Knippels, M. C. P., Gilissen, M. G., & Boersma, K. T. (2018). The theoretical nature of systems thinking. Perspectives on systems thinking in biology education. Frontiers in Education. https://doi.org/10.3389/feduc.2018.00040
Verhoeff, R. P., Waarlo, A. J., & Boersma, K. T. (2008). Systems modelling and the development of coherent understanding of cell biology. International Journal of Science Education, 30(4), 543–568. https://doi.org/10.1080/09500690701237780
von Bertalanffy, L. (1968). General system theory: Foundations, development, applications. George Braziller.
Wade, J., & Heydari, B. (2014). Complexity: Definition and reduction techniques some simple thoughts on complex systems. In Proceedings of the Poster Workshop at the 2014 Complex Systems Design & Management International Conference, pp. 213–226
Wand, Y., & Weber, R. (2002). Research commentary: Information systems and conceptual modeling—a research agenda. Information Systems Research, 13(4), 363–376.
Whitehead, N. P., Scherer, W. T., & Smith, M. C. (2015). Systems thinking about systems thinking: A proposal for a common language. IEEE Systems Journal, 9(4), 1117–1128. https://doi.org/10.1109/JSYST.2014.2332494
Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19. https://doi.org/10.1023/A:1009421303064
Winberg, C. (2008). Teaching engineering/engineering teaching: Interdisciplinary collaboration and the construction of academic identities. Teaching in Higher Education, 13(3), 353–367. https://doi.org/10.1080/13562510802045394
Wright, D. B. (2006). Comparing groups in a before-after design: When t-test and ANCOVA produce different results. The British Psychological Society, 76, 663–675.
Yoon, S. A., Anderson, E., Koehler-Yom, J., Evans, C., Park, M., Sheldon, J., Schoenfeld, I., Wendel, D., Scheintaub, H., & Klopfer, E. (2017). Teaching about complex systems is no simple matter: Building effective professional development for computer-supported complex systems instruction. Instructional Science, 45(1), 99–121. https://doi.org/10.1007/s11251-016-9388-7
York, S., & Orgill, M. (2020). ChEMIST table: a tool for designing or modifying instruction for a systems thinking approach in chemistry education. Journal of Chemical Education, 97(8), 2114–2129. https://doi.org/10.1021/acs.jchemed.0c00382
Funding
This work was supported by the Gordon Center for Systems Engineering at the Technion—Israel Institute of Technology, under Grant 2029857; and EIT Food’s project TRACOD under Grant 2030366.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Disclosure
The authors report there are no competing interests to declare.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices
Appendix 1: Description of mini-course assignments 1–3 with examples
Appendix 2: Modified Systems Thinking Assessment Rubric (STAR*) for top-level OPM diagrams
| Aspect | Attribute | Expected implementation of the attribute | Scoring |
|---|---|---|---|
| **System aspect** | | | |
| Function | A1—Intended purpose | Beneficiary and benefit are linked with the correct link (Exhibition-Characterization), and both are phrased correctly according to the context | Both beneficiary and benefit are absent—zero points. Only one of them (beneficiary/benefit) is used, or both without a correct link—one point. Both beneficiary and benefit are used and linked, but not accurately phrased—two points. Both beneficiary and benefit are correctly used, linked, and phrased—three points |
| | A2—Main function | Exactly one main process, which transforms at least one object, all phrased correctly according to the context. For SD1, at least three sub-processes, with the same specification as above | No main process, or a main process that is irrelevant to the context—zero points. The main process is correct but transforms no object(s) or is wrongly phrased—one point. The main process transforms at least one object and is phrased and linked correctly—two points |
| Structure | A3—Structural relations | Correct use of at least two out of four kinds of links between objects or between processes | No links—zero points. One link or more—one point |
| | A4—Level of detail | At least four objects are directly linked to the main process to provide sufficient information | Fewer than four objects—zero points. Four objects or more—one point |
| Behavior | A5—Procedural relations | Correct use of at least two procedural links between objects and processes | Fewer than two links—zero points. Two links or more—one point |
| **Model aspect** | | | |
| Clarity | A6—Model readability | The layout of all the model diagrams is organized to facilitate understanding: (1) links do not cross things and things do not occlude each other, (2) minimal links cross each other, and (3) entity (object, process, state) text is complete and words are not split | At least one violation—zero points. All fulfilled—one point |
| OPL | A7—OPL main process procedural sentences | The beneficiary is linked with an agent link, e.g., “Winemaker handles Harvesting.” The operand is linked with a correct result/effect link, e.g., “Bread Making yields Bread Loaf” or “Harvesting changes status of Grape from on tree to picked.” An instrument or consumption link is used correctly, e.g., “Bread Making requires Mixing Machine” and “Bread Making consumes Flour and Water” | No more than one of the three sentences is present, or no OPL is attached at all—zero points. Two or three sentences (out of three) are present—one point |
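For readers who wish to apply the rubric at scale, the sketch below illustrates one way to tally per-attribute scores into a total. It is a hypothetical illustration only: the attribute ceilings are read off the table (A1 up to three points, A2 up to two, A3 through A7 one each), and the function names are our own, not scoring code from the study.

```python
# Hypothetical tally helper for the modified STAR rubric.
# Ceilings are assumptions read off the table above: A1: 0-3, A2: 0-2,
# A3-A7: 0-1 each, giving a maximum total of 10 points.
MAX_SCORES = {"A1": 3, "A2": 2, "A3": 1, "A4": 1, "A5": 1, "A6": 1, "A7": 1}

def star_total(scores: dict) -> int:
    """Sum attribute scores after checking each lies within its range.

    Missing attributes default to zero, matching the rubric's
    "absent—zero points" convention.
    """
    total = 0
    for attr, ceiling in MAX_SCORES.items():
        value = scores.get(attr, 0)
        if not 0 <= value <= ceiling:
            raise ValueError(f"{attr} score {value} outside 0-{ceiling}")
        total += value
    return total

# Example: a model earning full marks on every attribute totals 10 points.
print(star_total({"A1": 3, "A2": 2, "A3": 1, "A4": 1,
                  "A5": 1, "A6": 1, "A7": 1}))
```

A dictionary keyed by attribute ID keeps the tally aligned with the rubric rows, so adding or re-weighting an attribute requires changing only `MAX_SCORES`.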
Appendix 3: Quiz 1—The OPD that forms the background for the quiz questions, with its introductory text
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Peretz, R., Tal, M., Akiri, E. et al. Fostering engineering and science students’ and teachers’ systems thinking and conceptual modeling skills. Instr Sci 51, 509–543 (2023). https://doi.org/10.1007/s11251-023-09625-9
Received:
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11251-023-09625-9