4.1 Introduction

Information processing involves the gathering, manipulation, storage, retrieval, and classification of recorded information. It is any alteration (processing) of information that is detectable by an observer. As such, it is a mechanism that describes everything that occurs (changes) in the world, from dropping a rock (a shift in position) to printing a text file from a digital computing device. In the latter case, the form of presentation of that text file (from bytes to glyphs) is changed by an information processor (the printer).

Information processing is the process of changing or converting data into a meaningful form. Information is processed, organized, or classified data that is useful to the receiver; it may be used "as is" or combined with further data or information. The receiver of information takes actions and makes decisions based on the information received. Collected data must be processed to extract meaning from it, and that meaning is obtained in the form of information.

4.2 The Emergence of Information Processing

Information processing is a model of human thinking and learning, and it is part of the resurgence of cognitive perspectives on learning. The cognitive perspective asserts that complex mental states affect human learning and behavior and that such mental states can be scientifically investigated. Computers, which process information, have internal states that affect that processing. Computers therefore provided a model of possible human mental states, giving researchers clues and direction for understanding human thinking and learning as information processing. Overall, information-processing models helped re-establish mental processes, which cannot be directly observed, as a legitimate area of scientific research.

4.3 Theories of Information Processing

4.3.1 Levels of Processing

One of the first alternatives to the stage theory was developed by Craik and Lockhart (1972) and labeled the levels of processing model. Specifically, the levels of processing theory holds that memory is not three-staged, which immediately separates it from the stage theory model. Craik and Lockhart argue that stimulus information is processed at several levels concurrently (not serially), depending on its features, the attention given to it, and its meaningfulness. Incoming information does not have to enter in any specific order, and it does not have to pass through a prescribed channel. They further contend that the more deeply information is processed, the more will be remembered (Kearsley, 2001b). This model was a precursor to the development of schema theory, discussed below. The two are consistent in that they agree that "the more connections to a single idea or concept, the more likely it is to be remembered" (Huitt, 2000).

4.3.2 Dual Coding Theory

As mentioned previously, another theory in the information processing debate is Paivio's work on dual coding (Clark & Paivio, 1991). This theory gives equal significance to verbal and nonverbal processing and suggests that there are two separate systems for processing these types of information. Images (mental images) are processed by one system, and logogens (verbal entities, chunks, or propositions) are processed by a different system. According to Kearsley (2001a), Paivio believes that human cognition is unique in that it has become specialized for dealing simultaneously with language and with nonverbal objects and actions. Moreover, the language system is peculiar in that it deals directly with linguistic input and output (in the form of speech or writing) while at the same time serving a symbolic function with respect to nonverbal objects, events, and behaviors. Any representational theory must accommodate this dual functionality. Further, Paivio suggests that there are three separate types of processing and communication between these two subsystems: representational, referential, and associative. Representational processing is the direct activation of one system or the other; referential processing is the activation of one subsystem by the other; and associative processing is activation within the same subsystem, without interaction with the other.
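Paivio's architecture, two subsystems linked by representational, referential, and associative processing, can be pictured with a small sketch. This is only an illustration: the class, the activation values, and the link structures below are assumptions made for exposition, not constructs taken from dual coding theory itself.

```python
# A minimal, illustrative sketch of dual coding: a verbal system (logogens)
# and a nonverbal system (images), with three kinds of activation.
# Class names, activation values, and link structures are assumptions.

class DualCodingSketch:
    def __init__(self):
        self.logogens = {}      # verbal units, e.g. the word "dog"
        self.images = {}        # nonverbal units, e.g. a mental image of a dog
        self.referential = []   # cross-system links (word <-> image)
        self.associative = []   # within-system links (word <-> word, image <-> image)

    def add_unit(self, name, system):
        store = self.logogens if system == "verbal" else self.images
        store[name] = 0.0       # activation level, initially at rest

    def link_referential(self, word, image):
        self.referential.append((word, image))

    def link_associative(self, a, b, system):
        self.associative.append((a, b, system))

    def representational(self, stimulus, system):
        """Direct activation of one system or the other by a stimulus."""
        store = self.logogens if system == "verbal" else self.images
        if stimulus in store:
            store[stimulus] = 1.0

    def referential_spread(self):
        """Activation of one subsystem by the other, across referential links."""
        for word, image in self.referential:
            if self.logogens.get(word, 0) > 0:
                self.images[image] = max(self.images.get(image, 0), 0.5)
            if self.images.get(image, 0) > 0:
                self.logogens[word] = max(self.logogens.get(word, 0), 0.5)

    def associative_spread(self):
        """Activation within one subsystem, without involving the other."""
        for a, b, system in self.associative:
            store = self.logogens if system == "verbal" else self.images
            if store.get(a, 0) > 0:
                store[b] = max(store.get(b, 0), 0.5)


model = DualCodingSketch()
model.add_unit("dog", system="verbal")
model.add_unit("bark", system="verbal")
model.add_unit("dog-image", system="nonverbal")
model.link_referential("dog", "dog-image")
model.link_associative("dog", "bark", system="verbal")

model.representational("dog", system="verbal")  # hearing or reading the word "dog"
model.referential_spread()                      # evokes the corresponding image
model.associative_spread()                      # primes the related word "bark"
print(model.logogens, model.images)
```

In this toy run, hearing the word "dog" (representational activation of the verbal system) evokes the image of a dog (referential activation) and primes the related word "bark" (associative activation within the verbal system).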

4.3.3 Schema Theory, Parallel Distributed Processing, and Connectionist Models

Rumelhart (1980), working in conjunction with others, developed the schema theory of information processing and memory. He suggested that a schema is a data structure for representing generic concepts stored in memory.

There are five key components to this view of memory and processing in relation to schema.

  1. It is an organized structure that exists in memory and is the sum of all gained knowledge.

  2. It exists at a higher level, or abstraction, than immediate experience.

  3. Its concepts are linked by propositions (verbal constructs).

  4. It is dynamic.

  5. It provides a context or structure for new information (Winn & Snyder, 2001).

This model is sometimes called the connectionist model or theory. Huitt (2000) explains that "This model emphasizes the fact that information is stored in multiple locations throughout the brain in the form of networks of connections." This model is explicitly different from previous ones in that it is not founded on a serial processing description. Rather, the connections between pieces of information are key, not the order in which the connections are made. Rumelhart later worked with McClelland and the Parallel Distributed Processing Research Group (McClelland & Rumelhart, 1981, 1986; Rumelhart & McClelland, 1986) to expand his initial work and connectionist theories. In this enhanced model, it is still proposed that the units of memory are connections rather than any concrete representation of previous information. The latter model goes further, however, saying that the activation of the connections is the knowledge unit. According to Driscoll (2001), there are many advantages to this model. She says that it accounts for the incremental nature of learning, is dynamic, incorporates goals of learning, and has the potential to explain cognitive development.
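The connectionist claim, that knowledge lives in weighted links and that activating those links is remembering, can be made concrete with a toy spreading-activation sketch. The network, the weights, the threshold, and the two-step propagation rule below are invented for illustration; they are not the McClelland and Rumelhart models.

```python
# Toy spreading-activation network: knowledge is carried by the connections,
# not by any single stored record. Weights and threshold are illustrative.

connections = {
    ("bird", "wings"): 0.9,
    ("bird", "flies"): 0.8,
    ("penguin", "bird"): 0.9,
    ("penguin", "flies"): -0.7,   # inhibitory link: penguins do not fly
}

def spread(seed_units, steps=2, threshold=0.3):
    """Propagate activation from seed units along weighted connections."""
    activation = {u: 1.0 for u in seed_units}
    for _ in range(steps):
        updates = {}
        for (a, b), w in connections.items():
            if activation.get(a, 0.0) > threshold:
                updates[b] = updates.get(b, 0.0) + activation[a] * w
        for unit, value in updates.items():
            activation[unit] = max(activation.get(unit, 0.0), 0.0) + value
    return activation

print(spread({"penguin"}))
```

Activating "penguin" spreads to "bird" and from there toward "wings" and "flies", while the inhibitory penguin-to-flies link suppresses the incorrect inference; nothing is retrieved from a single stored record, which is the point of the connectionist view.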

4.4 Types of Information Processing

Information can be processed in two ways: by natural information processing systems and by technical information processing systems.

4.4.1 Natural Information Processing System

Specific mechanisms for processing information arise in nature. Like all structures for processing information, their purpose is to coordinate the information that governs the operation of an organization implemented by a program. Natural information processing systems control the behavior of natural entities, such as living organisms. There are many ways to describe the underlying logic of natural information processing, but in this chapter we concentrate on five basic principles (see Table 4.1) to show how they relate both to human cognition and to evolution by natural selection. Some research evidence suggests that local and global information processing may aid decision-making under uncertainty (Mohanty & Suar, 2013a, b).

Table 4.1 Principles of general information processing system

4.5 Technical Information Processing System

Since the introduction of the term "cybernetics" by Norbert Wiener (1948), computer science and ICT (also referred to in this chapter as information processing technology, IPT) have developed in an unpredictable and unimaginable manner, a phenomenon known as the information revolution (Porter & Read, 1998) and, more recently, as the "second information revolution," underlining the remarkable growth of online communication. The changes in communication that have occurred over the past decade are often referred to as Web 2.0 (O'Reilly, 2005), a buzzword that reflects both technological and relational innovations. Finally, a consumer can now communicate with multiple devices: to exchange email, for example, one can use not only a personal computer (PC) but also a mobile phone, a PDA (personal digital assistant), or a smartphone (e.g., an iPhone) that combines all of these functions; access to websites through a TV set is also possible.

From the online communication scenario, two major themes emerge:

  1. Integration of communication tools and functionalities into the same website interface

  2. Fusion of the apps used to reach these networks, usually social network websites

4.5.1 Stages of Information Processing Cycle

Information processing is a sequence of events consisting of input, processing, storage, and output. To understand the information processing cycle, it helps to study the data processing cycle as well, since its events are similar. For a computer to perform useful work, it has to receive instructions and data from the outside world; the computer receives data and instructions during the INPUT stage of the information processing cycle. Useful information results when appropriate instructions are applied to data; applying instructions to data takes place during the PROCESSING stage. To avoid having to re-enter data and instructions or reprocess information, computers can save information; saving information occurs during the STORAGE stage. The result is then presented during the OUTPUT stage. The computer processing cycle is a similar process, with similar steps, by which data are fed to a computer.

Input

  1. Entering data into the computer.

  2. Feeding the collected raw data into the cycle for processing. This is the raw data that is supplied for processing and for obtaining information.

  3. Input can be done by utilizing various devices such as keyboards, mice, flatbed scanners, barcode readers, joysticks, digital data tablets (for graphics drawing), and electronic cash registers.

Processing

  1. Performing operations on the data.

  2. Once the input is provided, the raw data is processed by a suitable or selected processing method. This is the most crucial step, as it produces the processed data in the form of output that will be used further.

  3. Processing is usually done by the CPU (central processing unit) in a computer. The CPU is the crucial component for getting the operations done.

Storage

  1. Saving data in soft or physical form.

  2. The outcome of processing is saved here: the raw data provided in the first stage has now been "processed" into useful information, which is retained so that it does not have to be reprocessed.

  3. Storage can be done on an external hard disk, an inbuilt hard disk, pen drives, micro SD cards, compact disks, or even in registers.

Output

  1. The results obtained, i.e., information.

  2. This is the outcome: the raw data provided in the first stage has now been "processed" and is useful; it provides information and is no longer called data. This might be further used for data visualization.

  3. This can be used as it is or used for further processing along with more data (Fig. 4.1).

Fig. 4.1
A cyclic diagram of the information processing cycle. The stages are input, processing, storage, and output.

Information processing cycle
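As a concrete illustration of the cycle just described, the sketch below walks one batch of data through input, processing, storage, and output. The file name and the trivial summing step are assumptions chosen only to make each stage visible.

```python
# A minimal sketch of the input -> processing -> storage -> output cycle.
# The file name and the summing "processing" step are illustrative assumptions.
import json

def input_stage():
    # INPUT: receive raw data and instructions from the outside world.
    return [12, 7, 3, 40]

def processing_stage(raw_data):
    # PROCESSING: apply instructions to the raw data to produce information.
    return {"count": len(raw_data), "total": sum(raw_data)}

def storage_stage(information, path="results.json"):
    # STORAGE: save the information so it need not be re-entered or reprocessed.
    with open(path, "w") as f:
        json.dump(information, f)
    return path

def output_stage(path):
    # OUTPUT: present the stored information to the user or to further processing.
    with open(path) as f:
        print(json.load(f))

if __name__ == "__main__":
    raw = input_stage()
    info = processing_stage(raw)
    saved_at = storage_stage(info)
    output_stage(saved_at)
```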

4.6 Memory

One of the primary areas of cognition studied by researchers is memory. There are many hypotheses and suggestions as to how this integration occurs, and many new theories have built upon established beliefs in this area. Currently, there is widespread consensus on several aspects of information processing; however, there are many disagreements about the specifics of how the brain codes or manipulates information as it is stored in memory. Schacter and Tulving (as cited in Driscoll, 2001) state that a memory system is defined in terms of its brain mechanisms, the kind of information it processes, and the principles of its operation. This suggests that memory is the combination of all mental experiences. In this light, memory is a built-up store that must be accessed in some way for effective recall or retrieval to occur. It is premised on the belief that memory is a multifaceted, if not multistaged, system of connections and representations that encompasses a lifetime's accumulation of perceptions. Eliasmith (2001) defined memory as the "general ability, or faculty, that enables us to interpret the perceptual world to help organize responses to changes that take place in the world." This definition implies that there must be a tangible structure into which new stimuli are incorporated. The form of this structure has been the source of much debate, and there seems to be no absolute agreement on what shape a memory structure takes, but there are many theories of what constitutes both the memory structure and the knowledge unit.

4.6.1 The Stage Model

Traditionally, the most widely used model of information processing is the stage theory model, based on the work of Atkinson and Shiffrin (1968). The key elements of this model are that it views learning and memory as discontinuous and multistaged. It is hypothesized that as new information is taken in, it is in some way manipulated before it is stored. The stage theory model recognizes three types or stages of memory: sensory memory, short-term or working memory, and long-term memory (Table 4.2 and Fig. 4.2).

Table 4.2 The three parameters of short- and long-term memories
Fig. 4.2
A flow diagram depicts the 3-stage model of memory. Sensory memories, short-term memory, and long-term memory. Each stage contains an explanation.

Three-stage model of memory

4.6.2 Sensory Memory

Sensory memory represents the initial stage of stimulus perception. It is associated with the senses, and there seems to be a separate register for each type of sensory perception, each with its own limitations and devices. Stimuli that are not sensed cannot be further processed and will never become part of the memory store. This is not to say that only stimuli that are consciously perceived are stored; on the contrary, everyone takes in and perceives stimuli almost continuously. It is hypothesized, though, that perceptions that are not transferred to a higher stage will not be incorporated into memory that can be recalled. The quick transfer of new information to the next stage of processing is therefore of critical importance, and sensory memory acts as a portal for all information that is to become part of memory. This stage of memory is temporally limited, which means that information stored here begins to decay rapidly if not transferred to the next stage; this occurs in as little as 0.5 s for visual stimuli and 3 s for auditory stimuli. There are many ways to ensure transfer and many methods for facilitating it.

To this end, attention and automaticity are the two major influences on sensory memory, and much work has been done to understand the impact of each on information processing. Attention is defined by Suthers (1996) in terms of the "limitations in our perceptual processing and response generation: to attend to one thing is to not attend to others." To attend to a stimulus is to focus on it while consciously attempting to ignore other stimuli, but it is not exclusive of these competing others. Treisman (as cited in Driscoll, 2001) "showed, however, that attention is not an all-or-nothing proposition and suggested that it serves to attenuate, or tune out, stimulation." Attention does facilitate the integration and transfer of the information being attended to, but it is affected by many factors, including the meaningfulness of the new stimulus to the learner, the similarity between competing ideas or stimuli, the complexity of the new information, and the physical ability of the person to attend. Automaticity is almost the exact opposite of attention. Driscoll (2001) says that "When tasks are overlearned or sources of information become habitual, to the extent that their attention requirements are minimal, automaticity has occurred." Automaticity allows attention to be redirected to other information or stimuli and allows for multitasking without totally distracting from the acquisition of new information.

There are several suggested models of how new stimuli are recognized in sensory memory and how each deals with pattern recognition. The matching of new stimuli to existing memory structures is a crucial factor in the acquisition of new knowledge: if new information is not brought into memory in a meaningful way, it will not be stored as memory. Therefore, understanding the patterns by which this information is represented is critical to the proper introduction of new information. Driscoll (2001) says that pattern recognition is "the process whereby environmental stimuli are recognized as exemplars of concepts and principles already in memory." She discusses three models of pattern recognition: template matching, the prototype model, and feature analysis. The template matching model holds that exact representations of previous stimuli are held in the mind.
Pattern recognition, then, occurs by matching input with a specific, perfect specimen stored in memory. This model seems to fall short because of the vast number of templates that would have to exist in memory for any one type of entity and because it does not account for imperfect stimuli or imperfect templates. The second pattern recognition model is the prototype model. It suggests that the stored unit is a generalized or abstracted form of the knowledge unit, and pattern recognition is based on a comparison of the input to the prototype. If a close match is established, the new information can be accepted as a member of the existing class. These two models are very similar in that each attempts to match incoming information with a whole picture stored in memory. This holistic comparison differentiates them from the third model, feature analysis. In this model, incoming information is judged on the basis of its characteristics rather than as a whole; individual characteristics are picked out and then grouped to label the new stimulus as "X." The major difference is that feature analysis builds a label up from parts, whereas the first two models work downward from a stored whole.
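The three pattern recognition models can be contrasted in a short sketch. The feature vectors, the stored template and prototype, the distance measure, and the tolerance value are all invented for illustration; the models themselves are stated verbally in the literature, not as these particular functions.

```python
# Illustrative contrast of three pattern recognition models.
# Stimuli are represented as feature vectors or feature sets; all data are invented.

def template_match(stimulus, templates):
    """Template matching: accept only an exact match with a stored specimen."""
    return next((name for name, t in templates.items() if t == stimulus), None)

def prototype_match(stimulus, prototypes, tolerance=1.5):
    """Prototype model: compare to an abstracted average and accept close matches."""
    def distance(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    name, proto = min(prototypes.items(), key=lambda kv: distance(kv[1], stimulus))
    return name if distance(proto, stimulus) <= tolerance else None

def feature_analysis(stimulus_features, concepts):
    """Feature analysis: label by the concept whose defining features are present."""
    for name, required in concepts.items():
        if required.issubset(stimulus_features):
            return name
    return None

# A slightly imperfect "letter A" described as a feature vector and a feature set.
templates = {"A": [1, 1, 0, 1]}
prototypes = {"A": [1.0, 0.9, 0.1, 0.9]}
concepts = {"A": {"two oblique lines", "horizontal crossbar"}}

print(template_match([1, 1, 0, 0.8], templates))        # None: not an exact match
print(prototype_match([1, 1, 0, 0.8], prototypes))      # "A": close enough to the prototype
print(feature_analysis({"two oblique lines", "horizontal crossbar"}, concepts))  # "A"
```

The imperfect stimulus fails the exact template test but is accepted by the prototype comparison, and feature analysis labels it from its parts, mirroring the contrast drawn in the text.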

4.6.3 Short-Term or Working Memory

The second stage of information processing is working or short-term memory. This stage is often viewed as active or conscious memory because it is the part of memory that is being actively processed while new information is being taken in. Short-term memory has a very limited capacity, and unrehearsed information will begin to be lost from it within 15–30 s if no other action is taken. Two main ways are effective in processing information while it is in short-term memory. Rote or maintenance rehearsal is the first, but less desirable, of these methods. This type of rehearsal is intended only to keep the information available until it can be processed further. It consists mainly of some form of repetition of the new information, and if the information is not processed further, it will be lost. Studies on the limitations of working memory have pointed to a specific number of units that the mind can process at any given time, and it is now generally accepted that about 5 ± 2 is the maximum number of stimuli that can be processed at once.

There are several types of activities that one can perform to encode new information, and the importance of encoding cannot be overstated. Maintenance rehearsal schemes can be employed to keep information in short-term memory, but more complex elaboration is necessary to make the transfer to long-term memory. New information must somehow be incorporated into the memory structure in order for it to be retained. There are many suggested models of encoding, but there are three ways in which retention occurs. A stimulus can be an almost exact match with existing structures, in which case it is simply added to the mental representation and no change is made to the structure other than its addition. If the new stimulus does not exactly match the existing structure, the structure itself is adapted to allow for additional characteristics or definitions; in this case there is a fundamental change to the existing structure, which broadens its defining features. Finally, if the new stimulus is vastly different from any existing structure, a new structure is created in memory. This new structure may be linked in some way to relevant structures, but it stands alone as a new unit. In any case, the incoming information must be acted on by and through existing structures and incorporated into those systems in some way for acquisition to occur. The processing of the new stimulus takes place in short-term memory, and the body of knowledge with which the information is worked is long-term memory. The implications of this research are clear: if learning, a relatively permanent change, is to take place, new information must be transferred into long-term memory. Therefore, repetition and maintenance rehearsal are not sufficient to produce a lasting effect. This has great relevance to instruction and teaching, for if the aim of education is learning, information must be presented in such a way that it can be incorporated into the memory structure.

Long-term memory: As discussed with short-term memory, long-term memory houses all previous perceptions, knowledge, and information learned by an individual, but it is not a static file system used only for information retrieval. Abbot (2002) suggests that long-term memory "is that more permanent store in which information can reside in a dormant state—out of mind and unused—until you fetch it back into consciousness" (p. 1).
In order to incorporate new information, long-term memory must be in communication with short-term memory and must be dynamic. There are several categories of long-term memory, and there are many suggestions as to how memory units are represented in the mind. While it might seem sufficient to understand simply that individual units and structures exist in long-term memory, the specific way or ways in which information is stored offer extremely important information. If the knowledge unit is pictorial rather than verbal, for example, it would seem to make sense that images would be more easily and readily stored in memory; if the reverse were true, information should be presented in verbal constructs. This oversimplifies the problem, but it is this question that is at the core of the controversy over memory storage structures. Two issues arise in the discussion of long-term memory: the types of long-term memory and the type of knowledge unit stored in long-term memory. Organization of long-term memory: Today, cognitive psychologists believe that several distinct types of information are stored in long-term memory. Each of these memory structures is distinct and serves a different operational function. However, it is evident that some type of very specialized categorization system exists within the human mind. One of the first to make this idea explicit was Bruner (as cited in Anderson, 1998): based upon the idea of categorization, Bruner's theory states that "to perceive is to categorize, to conceptualize is to categorize, to learn is to form categories, to make decisions is to categorize."
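The flow from sensory memory through short-term memory to long-term memory described in this section can be summarized in a small sketch. The capacity figure is the rough value quoted above, and the data structures, function names, and example items are assumptions made only for illustration.

```python
# A minimal sketch of the three-stage flow (sensory -> short-term -> long-term)
# described above. The capacity value and the example items are assumptions;
# decay is modeled simply as "items that are not passed on are dropped".
from collections import deque

STM_CAPACITY = 5   # roughly 5 +/- 2 chunks can be held at once (figure quoted above)

def sensory_register(stimuli, attended):
    """Only attended stimuli survive the sensory stage; unattended ones decay
    within roughly 0.5 s (visual) to 3 s (auditory) and are lost."""
    return [s for s in stimuli if s in attended]

def short_term_memory(items, stm=None):
    """Hold a small number of chunks; without rehearsal they fade in 15-30 s,
    and when the store is full the oldest chunks are displaced."""
    stm = stm if stm is not None else deque(maxlen=STM_CAPACITY)
    for item in items:
        stm.append(item)
    return stm

def encode_to_long_term(stm, long_term, elaborated):
    """Only elaborately encoded chunks transfer to long-term memory;
    maintenance rehearsal alone leaves the rest to decay."""
    for chunk in stm:
        if chunk in elaborated:
            long_term.add(chunk)
    return long_term

long_term = set()
attended = {"phone number", "lecture point"}
perceived = ["phone number", "background noise", "lecture point", "passing car"]

stm = short_term_memory(sensory_register(perceived, attended))
long_term = encode_to_long_term(stm, long_term, elaborated={"lecture point"})
print(list(stm), long_term)   # both attended items reached STM; only one was encoded
```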

4.7 Computer–Mind Analogy (Fig. 4.3)

The development of computers in the 1950s and 1960s had an important influence on psychology and was, in part, responsible for the cognitive approach becoming the dominant approach in modern psychology (taking over from behaviorism). The computer gave cognitive psychologists a metaphor, or analogy, to which they could compare human mental processing. The use of the computer as a tool for thinking about how the human mind handles information is known as the computer analogy. Essentially, a computer codes (i.e., changes) information, stores information, uses information, and produces an output (retrieves information). The idea of information processing was adopted by cognitive psychologists as a model of how human thought works. For example, the eye receives visual information and codes it into electrical neural activity, which is fed to the brain where it is "stored" and "coded." This information can be used by other parts of the brain in mental activities such as memory, perception, and attention. The output (i.e., behavior) might be, for example, to read what you can see on a printed page. Hence, the information processing approach characterizes thinking as the environment providing an input of data, which is then transformed by our senses. The information can be stored, retrieved, and transformed using "mental programs," with the results being behavioral responses. Cognitive psychology has influenced and integrated with many other approaches and areas of study to produce, for example, social learning theory, cognitive neuropsychology, and artificial intelligence.

Fig. 4.3
An illustration depicts a human brain partitioned equally into two parts. The structure of the brain is depicted on the left, and a computer chip with integrated circuits is depicted on the right.

The computer–mind analogy

4.7.1 How Does Psychology Relate to Information Processing?

During the 1960s, American psychologists investigating and exploring the principles of cognitive theories ultimately developed a new approach called cognitive psychology or information processing. Cognitive psychology included a spectrum of processes like attention, perception, thinking, remembering, and problem-solving.

They gave up studying learning in isolation, which resulted in studying human learning as a whole rather than as separate components. The term cognition refers to the processes through which information coming from the senses is transformed (Fig. 4.4).

Fig. 4.4
A cartoon depicts the theme of information processing. Four animals stand below a tree and transfer information from one to other.

Natural way of information processing

4.7.2 Metacognition

  • Metacognition is our knowledge about attention, recognition, encoding, storage, and retrieval and how those operations might best be used to achieve a learning goal.

The Nature and Importance of Metacognition

  • It contains what we know about how person variables, task variables, and strategy variables affect learning.

  • Thus, it determines the extent to which students can be strategic learners.

Age Trends in Metacognition

  • Primary grade children have limited knowledge of:

    • Their memory capability.

    • Factors that affect reading comprehension and recall.

    • The need to tailor learning tactics to task demands.

    • When they have learned something well enough to pass a test.

  • Metacognitive knowledge develops with age, experience, and instruction.

4.8 Implications and Applications of Information Processing

The principles described above outline a framework that can plausibly be applied to human cognition by describing it, precisely and by analogy, in terms of evolution by natural selection, which is what matters most for our purposes. If this characterization of human cognition is accurate, consequences follow for fields such as education, training, and the communication of information, and for activities such as thinking, decision-making, problem-solving, and planning. It could be objected that the theory leaves out the very core of human cognition: our ability to imagine, solve problems, make decisions, and plan. If so, the hypothesis would be hopelessly incomplete as a theory of human cognition. In this section, we explain how these fundamental cognitive processes can be accounted for by the principles. Knowledge of general problem-solving strategies is biologically primary knowledge. There are no general problem-solving strategies available to us that are teachable and learnable, because we have evolved to acquire such strategies as primary knowledge. We cannot teach people how to use a means-end strategy, because people use that strategy automatically, without instruction. Although biologically primary knowledge is presumably held in an information store, we do not actively acquire it according to the usual principles of information processing outlined above. For instance, the idea of a working memory that is limited in capacity when dealing with novel information, but effectively unlimited when dealing with information organized in long-term memory, has no place when dealing with biologically primary, evolutionarily acquired knowledge. When thought processes and problem-solving involve primary knowledge, the usual principles of information processing become important only when those procedures are applied to secondary knowledge. We cannot be taught to use a means-end strategy because we learn to use it naturally as part of our biologically primary knowledge; applying that strategy to a novel scientific problem, however, brings the principles of general information processing systems into play, since most scientific knowledge is biologically secondary. We may not need to actively process the mechanics of means-end analysis in working memory, because those mechanics were acquired as primary knowledge, but the secondary information of the new problem does need to be processed in working memory. When dealing with biologically secondary knowledge, the interplay between borrowing and reorganizing, randomness as genesis, and the principle of environmental organization and linking determines how higher-level cognitive processes function. When we remember information, information in long-term memory is first brought into working memory using the principle of environmental organization and linking. Nothing new is created at this stage, because we are simply placing previously learned facts into working memory. Thinking requires one or both of two additional processes. Using the borrowing and reorganizing principle, we can reorganize the information, just as information is restructured during sexual reproduction, for example by recombination or transposition.
If we have not received additional information indicating how to reorganize the previous information, then, just as in the case of sexual reproduction, we must restructure the information randomly before checking it for effectiveness. Does the new, restructured information meet our environmental goals? If it does, we may use the restructured knowledge, storing it in long-term memory in its current form for later use or for further reorganization during thinking. Most learning can be assumed to occur through this reorganization cycle, just as most variability in evolution does. Entirely new information is generated on rarer occasions. If information is moved from long-term to working memory using the environmental organization and linking principle rather than being reorganized, it may be altered randomly, just as mutation alters the information in DNA. This random modification of knowledge held in the long-term store is also part of the thinking process. As with reorganization, the effects of randomly modifying information cannot be known before the modification occurs. The thinking process allows us to modify information randomly and then, as with mutation, test the modification for effectiveness, with successful modifications being available for storage in long-term memory. We consider that these interactions between borrowing and reorganizing, randomness as genesis, and the principle of environmental organization and linking form the basis of higher-level processes. When solving problems, we either reorganize previous information and test it for effectiveness, as occurs during sexual reproduction, or randomly generate new information, as occurs during mutation, which must likewise be tested for effectiveness. Where relevant knowledge is absent from long-term memory, random generation and test seems unavoidable. Consider a problem solver attempting to address a problem by solving analogous problems. If we know how to solve the target problem via the source analog, random generation and test is unnecessary and the new information can simply be processed. Attempts to solve problems by analogy often fail because the correct analog is not used or is misused; if we lack information about how to apply an analog, the process again requires random generation and test. We suggest that these processes may indicate why it can be so hard to reason by analogy. It should be noted that reasoning by analogy requires the simultaneous consideration of information from two or more problems, which is likely to impose a heavy working memory load. The same machinery provides a plausible substrate for decision-making. It can be concluded that all decisions rely on a mixture of previous knowledge, where such knowledge is available, and random generation and test to the extent that it is not. The way the mind uses incomplete or inaccurate information, based on feelings or other variables, contributes to well-known decision-based cognitive illusions. Planning is a characteristically human activity. It can be viewed as a particular case of thinking and can therefore be analyzed using the interactions between borrowing and reorganizing, randomness as genesis, and the principle of environmental organization and linking. We learn to use specific plans and specific planning procedures. As with other cognitive tasks, most of our strategies are borrowed from long-term memory, our own or someone else's.
They can be restructured to suit a particular situation, but if so, a random generation and test process is needed to decide whether the reorganized plan is effective. Entirely new plans are generated through the random generation and testing of novel procedures. The theory outlined here does not eliminate thought; rather, it attempts to explain it in terms of the interactions between borrowing and reorganizing, randomness as genesis, and the principle of environmental organization and linking. Whether the theory is true may depend, at least in some respects, on logical rather than empirical issues. For example, while the principle of randomness as genesis may be controversial in human cognition, we cannot empirically test whether a person must invoke that principle rather than an alternative in the absence of relevant information, because we are unable to devise any functional alternative. If relevant information, including knowledge that would allow us to generalize from similar circumstances, is unavailable, there appears to be no alternative to random generation and test against which we could construct an empirical test. No such alternative appears to exist in either natural or artificial information processing systems. By comparison, we do know that natural selection relies on random generation and test during evolution. As noted above, with regard to empirical concerns, when problem solvers tackle complex, multi-move problems, they often reach more dead ends than correct moves. Random generation and test explains those dead ends, and any alternative account would also need to explain them.
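The cycle this section describes, borrow prior knowledge from long-term memory, reorganize it or vary it randomly, test the result for effectiveness, and store what works, has the shape of a generate-and-test loop. The sketch below states that analogy in code; the "solution" encoding, the effectiveness test, and the probability of random variation are invented assumptions, not part of the theory.

```python
# Illustrative generate-and-test loop mirroring the described interaction of
# borrowing/reorganizing, randomness as genesis, and storage of what works.
# The solution encoding and the effectiveness test are invented examples.
import random

def effective(candidate, goal):
    """Stand-in test of effectiveness against an environmental goal."""
    return sum(candidate) >= goal

def reorganize(a, b):
    """Recombine two borrowed solutions (borrowing and reorganizing)."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def random_variation(candidate):
    """Randomly modify one element (the randomness-as-genesis step)."""
    mutated = list(candidate)
    mutated[random.randrange(len(mutated))] = random.randint(0, 9)
    return mutated

def think(long_term_memory, goal, attempts=200):
    """Borrow, reorganize or vary, test, and store what proves effective."""
    random.seed(0)  # reproducible illustration
    for _ in range(attempts):
        a, b = random.sample(long_term_memory, 2)    # borrow prior knowledge
        candidate = reorganize(a, b)                 # reorganize it
        if random.random() < 0.5:
            candidate = random_variation(candidate)  # or vary it randomly
        if effective(candidate, goal):               # test before keeping it
            long_term_memory.append(candidate)       # store what works
            return candidate
    return None                                      # dead end: nothing effective found

prior_knowledge = [[1, 2, 3, 4], [4, 3, 2, 1], [2, 2, 2, 2]]
print(think(prior_knowledge, goal=14))
```

Most successful candidates here come from recombining existing knowledge, with random variation stepping in when recombination alone is not enough, which parallels the claim that reorganization carries most learning while random generation handles the rest.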

4.9 Summary

The information-processing (IP) approach emerged as a response to behaviorism. This response concerned primarily the nature of scientific psychological explanation: the behaviorist "standard" account, phrased in purely external terms, was replaced with a "realistic" account articulated in terms of internal entities and processes. An analysis of the abstract vocabulary used in IP psychology reveals that a large number of terms are in simultaneous use, that there is no unambiguous level of analysis, and that basic concepts such as knowledge and retrieval remain largely unclear. Nevertheless, the IP approach has evolved over the past 25 years into a complex and systematic experimental science. A glance at actual practice indicates the fundamental cause of its success: the approach is concerned not so much with the absolute or intrinsic properties of the human information processor as with what can be termed its relative or differential properties. Further study of this feature of the IP methodology, in terms of the systematic vocabulary of a conceptual system, makes the reason for its success clear. The IP approach can be seen as constructing an analytical discrepancy calculus over an undefined class of objects, phrased in a theoretically neutral vocabulary of interpretation, with operators structurally similar to logical operators. This reinterpretation of what the IP approach is about offers a range of benefits. It reinforces its role as an autonomous discipline, clarifies its connection to other approaches to psychology and to other sciences within the cognitive science community, and renders it independent of methodological subtleties.