
Forethought

Is the design process more like an orderly progression from beginning to end, as in the image of the stair steps on the left (Figure 11.1), or a somewhat unpredictable adventure as in the picture of the skier? What kinds of tools can help with this process?

Figure 11.1. Orderly Progression or Unpredictable Adventure?

Processes and the tools that support them can be orderly and predictable, or little more than general guidelines. The first type can be classified as algorithmic and the second as heuristic. Algorithmic processes and tools lead predictably to solutions. They include such things as flowcharts or checklists that explain how to start or stop a piece of machinery, how to fill out an application for social security, how to compute the amount of interest you will pay over the lifetime of a loan, recipes for baking cakes, and instructions for replacing a flat tire. These tools are like calculators: they provide answers, or guaranteed solutions to problems, provided they are applied exactly as intended.
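For instance, the interest paid over the lifetime of a loan can be computed mechanically from just the principal, rate, and term. The sketch below is a hypothetical illustration of such an algorithmic tool (the function name and figures are invented for this example), using the standard fixed-rate amortization formula.

```python
def total_loan_interest(principal: float, annual_rate: float, years: int) -> float:
    """Total interest paid over the life of a fixed-rate loan,
    computed with the standard monthly amortization formula."""
    monthly_rate = annual_rate / 12
    n_payments = years * 12
    if monthly_rate == 0:
        return 0.0  # interest-free loan
    monthly_payment = principal * monthly_rate / (1 - (1 + monthly_rate) ** -n_payments)
    return monthly_payment * n_payments - principal

# A $200,000 loan at 6% annual interest for 30 years:
print(f"${total_loan_interest(200_000, 0.06, 30):,.2f}")  # about $231,676
```

Given the same inputs, this procedure always produces the same answer; that determinism is what makes it algorithmic rather than heuristic.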

Other tools help you work toward a solution but do not guarantee that you will achieve it. They are called heuristics and are more in the nature of “rules of thumb.” They give you leverage in trying to solve your problem but still require you to apply your own problem-solving skills. Examples include checklists of things to consider packing for overseas travel, symptoms to consider when trying to diagnose a problem with your health or your automobile, and strategies or sets of tactics to use when designing instruction or learning how to troubleshoot an electrical circuit. The directional indicators in the skier illustration (Figure 11.1) are examples of heuristic tools. They give the skier a general sense of what direction to go, but there are too many uncontrollable factors in the skier’s environment to guarantee that the skier will arrive at an exact destination.

When attempting to understand human behavior and design solutions or interventions to improve people’s knowledge, performance, or attitudes, there are far more heuristics available than algorithms. Landa (1974, 1976) conducted intensive studies of the development and use of algorithms in the context of instruction, learning, and performance, and he demonstrated how powerful and effective they can be. However, they are limited in their scope of application: they are particularly effective in technical and procedural areas of learning and performance, but less feasible in problem-solving or knowledge-construction areas. Motivational contexts are even more constrained, because most aspects of motivational design are heuristic. The tools described in this chapter can help you be more consistent in your efforts to analyze learner motivation and to design strategies, but you must still use judgment combined with experience and creativity to create effective motivational strategies. Your ability to produce innovative, creative solutions will grow with experience and effort. This is a learnable skill, and it has been demonstrated by many people who have applied themselves to implementing the systematic motivational design process.

Introduction

This chapter contains six tools that support or supplement those that were presented in earlier chapters. Four of them provide heuristic support for specific kinds of motivational design tasks and two are measurement instruments that have been validated and used in many research studies.

  • Simplified approach to motivational design

  • Motivational Idea Worksheet

  • Two motivational measurement instruments

    • The Course Interest Survey

    • The Instructional Materials Motivational Survey

  • Motivational Tactics Checklist

  • Motivational Delivery Checklist

Motivational Design Matrix: A Simplified Approach

Introduction

The complete ten-step motivational design process is useful for large-scale projects, such as adding motivational enhancements to a whole course or workshop, or projects on which multiple people are working. The ten-step model provides guidance for in-depth analysis of the audience and environment and supports documentation of each step for coordination and future reference. However, many projects do not require this degree of support. When a teacher or instructional developer is designing a single lesson or module, a much simpler approach is desirable.

Suzuki developed and validated a simplified approach to motivational design (Suzuki & Keller, 1996) that was subsequently applied in two distributed learning environments. The first was in the development of motivationally adaptive computer-based instruction (Song & Keller, 2001). The second application was in the development of student support methods for a multinational distance learning course (Visser, 1998).

In Sendai, Japan, a team of 25 teachers in eight subject areas at Sendai Daichi Junior High School had been developing computer application projects for several years as part of a demonstration project sponsored by the Japanese national government. During the final two years of the project (1994–1995) they were asked to incorporate systematic motivational design into their process. Suzuki (as reported in Suzuki & Keller, 1996) developed the simplified approach to motivational design because the full, ten-step model would require too much time for training and implementation. The goal of the simplified approach was to ensure that the teachers would identify key motivational characteristics in the learners, the content area to be taught, and the hardware or software to be used. The teachers then evaluated this information and prescribed tactics based on identified motivational problems. This process helped ensure that teachers avoided the inclusion of excessive numbers of tactics, or tactics derived from their own preferred areas of interest without regard to the characteristics of the students and the situation.

The resulting design process is represented in a matrix (Table 11.1). In the first row, the designer lists salient characteristics of the learners’ overall motivation to learn. The second row contains the designer’s judgments about how appealing the learning task will be to the learners. The third and fourth rows ask about learners’ expected attitudes toward the medium of instruction and the instructional materials. Each entry in these rows carries a “plus” or “minus” sign to indicate whether it is a positive or negative motivational characteristic. Based on the information in these four rows, the motivational designers decide how much motivational support is required and what types of tactics to use. They refer to reference lists of potential tactics (Keller & Burkman, 1993; Keller & Suzuki, 1988) and also create their own based on the identified needs.

Table 11.1. Report Matrix to Support Simplified Design Process (adapted from Suzuki & Keller, 1996).

In this example, the Internet teacher determined that confidence was the only real problem area, and he listed some specific tactics to deal with it. He also listed tactics for the other categories, but these serve to maintain motivation rather than to solve a specific problem. One benefit of the process became clear in comparison with his initial motivational plan: before applying the process, this teacher had a much longer list of tactics that he thought would be exciting and motivational. After doing the analysis and applying the selection criteria listed in the training materials on motivational design, he realized that his original list of tactics would be too time consuming and would actually distract from the students’ intrinsic interest in the subject. By using the design process, he was able to simplify the motivational plan and target it to specific needs.
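One way to picture the logic of the matrix is as a small data structure: each analysis row holds entries tagged with their plus or minus judgments, and rows dominated by negative entries become candidates for problem-solving tactics, while the rest only need maintenance tactics. The sketch below is a hypothetical reconstruction in that spirit; the row labels and entries are invented, not taken from Table 11.1.

```python
# Hypothetical sketch of a simplified design matrix; the entries are
# invented for illustration (the actual example appears in Table 11.1).
matrix = {
    "learner motivation":        [("curious about computers", "+"), ("accustomed to teacher-led lessons", "-")],
    "appeal of learning task":   [("authentic Internet search activity", "+")],
    "attitude toward medium":    [("unfamiliar software", "-"), ("afraid of making mistakes online", "-")],
    "attitude toward materials": [("clear, well-illustrated handouts", "+")],
}

for row, entries in matrix.items():
    negatives = sum(1 for _, sign in entries if sign == "-")
    if negatives > len(entries) / 2:
        print(f"{row}: problem area -> prescribe corrective tactics")
    else:
        print(f"{row}: satisfactory -> maintain motivation")
```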

An evaluation of the effectiveness of this motivational design process (Suzuki & Keller, 1996) verified that the teachers were able to use the matrix accurately with only a few entries not being placed appropriately, and more than two-thirds felt that it definitely helped them produce a more effective motivational design. Some teachers had difficulties with the analysis phase, which indicates that this is a critical area to address in training people to use the process.

Application in Motivationally Adaptive Instruction

This simplified design process was modified and used in at least two subsequent projects. The first was in the development of motivationally adaptive computer-based instruction, and the second was in a distance learning course.

One challenge in conducting the formal motivational design process before a course begins is that learner motivation changes over time, and it can change in unpredictable ways. In a classroom or other instructor-led setting, an expert instructor can continuously gauge the audience’s motivational condition and make adjustments as appropriate. But in self-directed learning environments, this type of continuous adjustment has not been a feature. Once the instruction has been designed and “packaged,” everyone receives the same program, with the exception of limited branching and other learner-control options. These options can have a positive effect on motivation, but they do not adequately reflect the range of motivational conditions that characterize learners at different points in time.

It would be possible to include a large number of motivational tactics to cover a broad range of motivational conditions, but this would most likely have a negative effect on motivation and performance. The reason is that when students are motivated to learn, they want to work on highly task-relevant activities. They do not want to be distracted with unnecessary motivational activities such as extrinsic games or “ice breakers.” For this reason, it would be beneficial to have computer or multimedia software that can sense a learner’s motivation level and respond adaptively.

Song (Song, 1998; Song & Keller, 2001) designed and tested an approach to motivationally adaptive instruction. He built checkpoints into an instructional program on genetics for junior high school students. At predetermined points, students in the primary treatment group received a screen asking several questions about their motivational attitudes. Based on the responses, which were compared to actual performance levels, students would receive motivational tactics designed to improve attention, relevance, or confidence. He used a variation of the simplified ARCS model design process to create specifications for tactics to be included in the adaptive treatment. The resulting motivation and performance of this group was compared to a group that received highly efficient instruction with only a minimum of motivational tactics that centered primarily on acceptable screen layout. A second comparison group received the maximum number of tactics; that is, they received all of the tactics that were in the pool of potential tactics for the treatment group.
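The checkpoint logic of such a motivationally adaptive program can be sketched roughly as follows. This is a minimal illustration of the idea, assuming a 1–5 self-report per ARCS category and a simple performance threshold; the thresholds, rules, and tactic pool are invented placeholders, not Song’s actual materials.

```python
# Rough sketch of a motivationally adaptive checkpoint (hypothetical
# thresholds and tactics; Song's actual instrument and rules differed).
TACTICS = {
    "attention":  "insert a curiosity-arousing question before the next segment",
    "relevance":  "relate the next genetics concept to a familiar, everyday situation",
    "confidence": "offer a worked example and an easier practice item first",
}

def checkpoint(self_report, performance):
    """Compare self-reported motivation (1-5 per ARCS category) with
    actual performance (0.0-1.0) and return tactics for weak categories."""
    prescribed = []
    for category, rating in self_report.items():
        # A low self-rating, or a high rating contradicted by low
        # performance, both signal a motivational problem.
        if rating <= 2 or (rating >= 4 and performance < 0.5):
            prescribed.append(TACTICS[category])
    return prescribed

print(checkpoint({"attention": 2, "relevance": 4, "confidence": 5}, performance=0.4))
```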

The results indicated that both the adaptive and full-featured treatments were superior to the minimalist treatment, and in most instances the adaptive treatment was superior to the full-featured one. There were limitations on the types of computer features that could be used in this study (for example, there was no sound), but based on these results, a more sophisticated treatment, and one lasting longer than one hour, would be expected to show even stronger effects.

This study was a pioneering effort. Earlier papers that discussed or tested adaptive motivational design (Astleitner & Keller, 1995; del Soldato & du Boulay, 1995) were extremely rigorous but more limited in their approach; that is, they tended to focus on a particular aspect of motivation such as persistence or confidence. Song’s study is more holistic and provides a good foundation for a series of follow-up studies.

Application in Distance Learning

The second extension of the simplified design process is in distance learning (Visser, 1998) and provides another example of the multicultural nature of this work. Visser, who lived in France, conducted her research with a distance learning course offered by a university in the United Kingdom and was working under the sponsorship of her university in The Netherlands. Furthermore, her study included an adaptation of a motivational strategy developed and validated in an adult education setting in Mozambique (Visser & Keller, 1990).

There is no doubt that there are serious motivational challenges among distance learners, especially when they are not able to avail themselves of social networks. This was the case in Visser’s sample and would still be true in many parts of the world that rely on paper-based instructional materials distributed to distant learners with limited or no Internet support. The attrition rate alone can be viewed as an indication of motivational problems. Student comments often focus on feelings of isolation, the absence of a sense of steady progress, and serious doubts about being able to finish the course given their other responsibilities and time constraints.

Visser (1998) used the simplified ARCS model design process to analyze the audience, conditions, and potential solutions in her situation. Her application of this process was contextualized in two ways. First was its restriction to a somewhat formal and traditional distance learning course, which used textual materials supplemented by an occasional audio- or videocassette. Based on her global assessment of the motivational problems in this situation, she concluded that it might be possible to have a positive effect on motivation by focusing on the student support system rather than on the instruction, which could not be easily revised.

The second way in which her study was contextualized was its focus on the validation of a particular motivational strategy, although it does allow for the incorporation of multiple tactics. Her approach was to implement a program of “motivational messages” that would be sent to students according to two schedules. The first was a set of fixed points based on predictions of the moments during the course when these messages might have the strongest effect. These messages were the same for everyone. The second schedule consisted of personal messages sent to students when the tutor deemed it appropriate. These messages were in the form of greeting cards, which conveyed messages of encouragement, reminders, empathy, advice, and other appropriate content areas.
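The two schedules can be pictured as a simple dispatch over the course timeline: fixed-point messages go to every student at predicted low points, while personal messages depend on the tutor’s judgment about an individual student. The sketch below is a hypothetical illustration only; the week numbers, message texts, and the at-risk flag are all invented and do not reproduce Visser’s actual messages.

```python
# Hypothetical sketch of the two message schedules in a design like
# Visser's (the weeks and texts here are invented for illustration).
FIXED_MESSAGES = {  # same for everyone, at predicted motivational low points
    1:  "Welcome! A strong start now will make the midterm weeks much easier.",
    6:  "The middle of the course is the hardest stretch -- keep going!",
    12: "You are close to the finish line; plan your remaining time now.",
}

def messages_for(week, student_name, tutor_flags_at_risk):
    """Return the motivational messages due for one student this week."""
    messages = []
    if week in FIXED_MESSAGES:
        messages.append(FIXED_MESSAGES[week])
    if tutor_flags_at_risk:  # personal message, sent when the tutor deems it appropriate
        messages.append(f"Dear {student_name}, I noticed you may be struggling -- shall we talk?")
    return messages

print(messages_for(6, "Ana", tutor_flags_at_risk=True))
```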

Design of the messages was based on the results of her application of the simplified design process. Her matrix was similar to the format created by Suzuki, but she modified the row headings to include specific aspects of the situation to consider in the analysis (Table 11.2). It is well known that distance learners are typically positive and excited at the beginning of a course but often lose interest and become discouraged later on. Therefore, she distinguished between her predictions of precourse attitudes (Row 1) and midterm attitudes (Row 2). Her responses to those issues came primarily from her experience in teaching this type of course with similar audiences. The third row (Table 11.2) predicts attitudes toward the course content, and the fourth row asks about learners’ attitudes toward the support they receive while taking the course. In the fifth row she summarized the results of the first four rows and used this as a basis for deciding which motivational tactics to use. Her matrix (Visser, 1998) provided an effective summary of major issues and decisions, even though she went beyond the matrix in the final stages of designing strategies.

Table 11.2. Design Factor Categories from Visser (1998).

To assess the effectiveness of this intervention, she compared retention rates in the experimental section of the course to three other sections that did not receive motivational messages, and she did a qualitative review of student responses to various course evaluation and feedback instruments. She did not ask students directly about the effects of the motivational messages, to avoid stimulating attitudes that may not have been present spontaneously in their minds. Improved retention rates of 70–80%, comparable to those of conventional education, together with the student comments, offered clear support for the motivational messages.

Motivational Idea Worksheet

Introduction

The Motivational Idea Worksheet (Table 11.3) is even simpler than the simplified design matrix. This worksheet can be quite helpful in generating an idea for a single motivational tactic to use in a specific place in a lesson, or simply in recording an idea for a motivational tactic that has popped into your mind, even if you don’t yet know when or where you will use it. Have you ever been listening to a presentation at a conference, observed a colleague using a tactic that you thought was interesting, or read something that gave you an idea? If so, this one-page worksheet is useful for documenting your idea and cataloging it according to its motivational characteristics. I have students in my motivational classes and workshops fill these out as we go through the psychological foundations part of the program. Then, when we get to the design process, they already have several ideas to incorporate.

Table 11.3. Motivational Idea Worksheet.

Instructions

In the Setting block (Table 11.3) you can include information about the learning topic or class to which this idea applies. The motivational idea that you are recording or trying to create could apply to a foreign language vocabulary lesson, a Civil War history lesson, a math lesson on triangles, an instructional systems lesson on system theory and cybernetics, or anything else. Also in this block, you can make comments about the intended audience and other pertinent information, such as whether your idea is for middle school children with math anxiety, the accelerated learners in your 3rd period German language class, or master’s students in instructional systems who have no background in training design or delivery.

In the Situation, or Problem, block you can briefly describe the situation in which the motivational problem is occurring or is expected to occur. It could be such things as the introductory part of a lesson on the Pythagorean Theorem in your middle school math class, the orientation lecture when inexperienced master’s students are in the same classroom as students who have had work experience related to the topic, creating interest among advanced German language students who will be bored with the standard lesson, or creating a positive learning climate with employees assigned to mandatory training.

The third step is to check the motivational goals that you are hoping to achieve. For example, the primary goals with the German language students might be A.2 (Stimulate an attitude of inquiry) and R.3 (Tie to experiences). Identifying your specific motivational goals helps keep you focused on the purpose of the motivational idea as you compose it. This step constitutes your motivational analysis of the audience and helps you focus your idea for a motivational tactic.

The largest block on the page is where you compose your motivational idea. You can describe it in any way you wish, but you might want to have one or two sentences that give an overview of what it is and then elaborate on it. Also, it is a good idea to record your thoughts about how you will implement it. Will you divide the class into groups? How many? Do you have a YouTube video that you will include? What is the URL? (Keep in mind that YouTube is not totally stable; sometimes a favorite video disappears.)

Finally, if you implement your strategy, you can record the results. New ideas are sometimes totally successful on the first try, but often they aren’t, at least not in the way you intended! Your best strategies will probably result from modifications and fine-tuning of your original idea.

ARCS-Based Measures of Motivation

Introduction

There are two measurement tools that can be used in conjunction with the ARCS model. The first, called the Course Interest Survey (CIS), was designed to measure students’ reactions to instructor-led instruction. The second, called the Instructional Materials Motivation Survey (IMMS), was designed to measure reactions to self-directed instructional materials. These are situation-specific self-report measures that can be used to estimate learners’ motivational attitudes in the context of virtually any delivery system. The CIS can be used in face-to-face classroom instruction and in both synchronous and asynchronous online courses that are instructor facilitated. The IMMS can be used with print-based self-directed learning, computer-based instruction, or online courses that are primarily self-directed.

Furthermore, they were designed to be in correspondence with the theoretical foundation represented by the motivational concepts and theories comprising the ARCS Model (Keller, 1987a, 1987b). Because this theory incorporates psychological constructs from the empirical literature on human motivation (Keller, 1979, 1983, 1999), many of the items in the CIS and IMMS are similar in intent, but not in wording, to items in established measures of constructs such as need for achievement, locus of control, and self-efficacy, to mention three examples.

As situational instruments, the CIS and IMMS are not intended to measure students’ generalized levels of motivation toward school learning; that is, they are not trait- or construct-type measures. The goal with these instruments was to be able to measure how motivated students are with respect to a particular course. The expectation is that these surveys can be used with undergraduate and graduate students, adults in non-collegiate settings, and with secondary students. They can also be used with younger students who have appropriate reading levels. With younger students or ones who are not sufficiently literate in English, some of the items may have to be read aloud and paraphrased to relate them to the classroom experiences of the audience.

Furthermore, both instruments can be adapted to fit specific situations. That is, the “default” wording of items contains phrases such as “this course,” or “this lesson.” These can be changed to fit the specific situation that is being assessed, such as “this lecture,” “this computer-based instruction,” or “this workshop.” Also, it is possible to change the tense of the items to use them as a pretest. However, the substance of the items cannot be changed because they are based on specific attributes of motivation.

Development Process

A pool of potential items was developed for each instrument by reviewing motivational concepts, strategies, and measurement instruments. These items were reviewed by 10 graduate students who were well versed in the motivational literature. They responded to each item and then discussed any items that seemed ambiguous, unrelated to the appropriate higher-order concept (i.e., attention, relevance, confidence, or satisfaction), or otherwise difficult to respond to.

The original item pool was reduced and revised to remove ambiguity, sharpen the key concept in each item, and eliminate “double-barreled” items by dividing them into two items or improving their focus. Then, the items were subjected to a further ambiguity check in which respondents answered in a contrived manner, called “faking it.” A different group of ten adults, mostly graduate students but not experts in the motivational literature, were told to respond twice to the instrument they received. The first time they were to “fake good,” and the second time to “fake bad.” That is, they were first to assume they were taking a course that was highly motivating and to answer each item in a way that would indicate their highly positive motivation; second, they were to assume that the course was totally unmotivating and to answer accordingly. This test revealed a few items that could “go either way,” which meant they were poor discriminators. For example, both motivated and unmotivated students could agree with an item such as, “the instructor is very likeable.” These items were then revised and retested, or deleted.

Course Interest Survey

The CIS has 34 items with approximately equal numbers in each of the four ARCS categories. The items are listed (Table 11.4) in the order that they are normally administered. However, each of the four subscales can be used and scored independently. Also, the format of the survey can be modified to use Likert-type scales and electronic scoring methods. In this section, descriptions of the scoring procedure, reliability estimation, and original validity test are presented.

Table 11.4. The Course Interest Survey Instrument.

Scoring

The CIS can be scored for each of the four subscales or for the total scale score (Table 11.5). The response scale ranges from 1 to 5 (see Table 11.4). This means that the minimum score on the 34-item survey is 34, and the maximum is 170, with a midpoint of 102. The minimums, maximums, and midpoints for each subscale vary because they do not all have the same number of items.

Table 11.5. Scoring Guide for the Course Interest Survey (CIS).

An alternate scoring method is to find the average score for each subscale and the total scale instead of using sums. For each respondent, divide the total score on a given scale by the number of items in that scale. This converts the totals into a score ranging from 1 to 5 and makes it easier to compare performance on each of the subscales.

There are no norms for the survey. As it is a situation-specific measure there is no expectation of a normal distribution of responses.

Scores are determined by summing the responses for each subscale and the total scale. Please note that the items marked reverse (Table 11.5) are stated in a negative manner. The responses have to be reversed before they can be added into the response total. That is, for the reversed items, 5 = 1, 4 = 2, 3 = 3, 2 = 4, and 1 = 5.
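These scoring rules are easy to automate. The sketch below is a minimal illustration that parameterizes the scale length and the positions of the reverse-keyed items, so the same function serves the CIS (34 items) and the IMMS (36 items); the reverse-item positions used in the example are placeholders, since the actual positions are given in Tables 11.5 and 11.9.

```python
def score_scale(responses, reverse_items):
    """Score a CIS/IMMS-style survey: reverse the negatively worded
    items (5=1, 4=2, 3=3, 2=4, 1=5), then report the summed total
    and the 1-5 average."""
    adjusted = [6 - r if position in reverse_items else r
                for position, r in enumerate(responses, start=1)]
    total = sum(adjusted)
    return {"total": total, "average": total / len(adjusted)}

# 34 CIS responses of "4"; the reverse-item positions {4, 7, 26} are
# placeholders -- consult Table 11.5 for the actual reverse-keyed items.
print(score_scale([4] * 34, reverse_items={4, 7, 26}))
# -> {'total': 130, 'average': 3.82...}
```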

CIS Internal Consistency (Reliability) Estimation

The survey was first administered to a class of 45 university undergraduates, and the internal consistency estimates were satisfactorily high. A pretest version was prepared by rewriting items in the future tense and was administered to an undergraduate class of 65 students. The internal consistency estimates were high, but further revisions were made to improve the instrument. The standard version of the survey was then administered to 200 undergraduates and graduate students in the School of Education at a university in the Southeast. Information was also obtained about the students’ course grades and grade point averages. The internal consistency estimates, based on Cronbach’s alpha, were satisfactory (Table 11.6).

Table 11.6. CIS Internal Consistency Estimates.
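For readers who want to produce this kind of reliability estimate for their own data, Cronbach’s alpha can be computed directly from a respondents-by-items score matrix using the standard formula: alpha = (k/(k−1)) × (1 − sum of item variances / variance of total scores). The data in the sketch below are made up purely for illustration.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]
    sum_item_variances = scores.var(axis=0, ddof=1).sum()
    total_score_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_variances / total_score_variance)

# Made-up responses: 6 respondents answering 4 items on a 1-5 scale.
demo = np.array([[4, 5, 4, 4],
                 [2, 2, 3, 2],
                 [5, 5, 5, 4],
                 [3, 3, 2, 3],
                 [4, 4, 4, 5],
                 [1, 2, 1, 2]])
print(round(cronbach_alpha(demo), 2))
```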

CIS Situational Validity

CIS scores from the 200 university undergraduates and graduates used for internal consistency estimation were correlated with their course grades and grade point averages (Table 11.7). All of the correlations with course grade are significant at or beyond the 0.05 level, and none of the correlations with grade point average are significant at the 0.05 level. This supports the validity of the CIS as a situation-specific measure of motivation, and not as a generalized motivation measure, or “construct” measure, for school learning.

Table 11.7. CIS Correlations with Course Grade and GPA.
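The logic of this validity check, a significant correlation with the situation-specific criterion (course grade) but not with the generalized criterion (GPA), amounts to a pair of Pearson correlations. The numbers in the sketch below are made-up placeholders, not the study’s data.

```python
from scipy import stats

# Made-up placeholder data for eight students (illustration only; the
# study's actual correlations appear in Table 11.7).
cis_total = [150, 120, 165, 100, 140, 130, 155, 110]
grade     = [3.7, 2.7, 4.0, 2.0, 3.3, 3.0, 3.7, 2.3]
gpa       = [3.2, 3.4, 3.1, 3.0, 3.5, 3.3, 2.9, 3.6]

r, p = stats.pearsonr(cis_total, grade)
print(f"CIS vs. course grade: r = {r:.2f}, p = {p:.3f}")  # situational: should be significant
r, p = stats.pearsonr(cis_total, gpa)
print(f"CIS vs. GPA:          r = {r:.2f}, p = {p:.3f}")  # generalized: should not be
```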

Instructional Materials Motivation Scale

The IMMS has 36 items. The Relevance and Confidence subscales each have 9 items, the Satisfaction subscale has 6, and the Attention subscale has 12. The primary reasons for the disproportionate sizes of the Attention and Satisfaction subscales are that boredom and lack of stimulation are ubiquitous problems in instructional writing, and that the satisfaction category has fewer points of connection to printed material than the others. As with the CIS, the IMMS items are listed (Table 11.8) in the order in which they are normally administered, but each of the four subscales can be used and scored independently. Also, the format of the survey can be modified to use Likert-type scales and electronic scoring methods. As in the previous section, descriptions of the scoring procedure, reliability estimation, and original validity test are presented below.

Table 11.8. The Instructional Materials Motivation Survey Instrument.

Scoring

As with the CIS, the IMMS can be scored for each of the four subscales or for the total scale score (Table 11.9). The response scale ranges from 1 to 5 (see Table 11.8). This means that the minimum score on the 36-item survey is 36, and the maximum is 180, with a midpoint of 108. The minimums, maximums, and midpoints for each subscale vary because they do not all have the same number of items.

Table 11.9. IMMS Scoring Guide.

An alternate and preferable scoring method is to find the average score for each subscale and the total scale instead of using sums, especially with the unequal sizes of the subscales. For each respondent, divide the total score on a given scale by the number of items in that scale. This converts the totals into a score ranging from 1 to 5 and makes it easier to compare performance on each of the subscales.

One cannot designate a given score as high or low because there are no norms for the survey. Scores obtained at one point in time, as in a pretest, can be compared with subsequent scores or with the scores obtained by people in a comparison group. Also, as it is a situation-specific measure, there is no expectation of a normal distribution of responses.

Scores are determined by summing the responses for each subscale and the total scale. Please note that the items marked reverse (Table 11.9) are stated in a negative manner. The responses have to be reversed before they can be added into the response total. That is, for these items, 5 = 1, 4 = 2, 3 = 3, 2 = 4, and 1 = 5.

IMMS Internal Consistency (Reliability) Estimation

The survey was administered to a total of 90 undergraduate students in two classes for preservice teachers at a large Southern university. The internal consistency estimates, based on Cronbach’s alpha, were satisfactory (Table 11.10).

Table 11.10. IMMS Reliability Estimates.

IMMS Validity Test

Validity was established by preparing two sets of instructional materials covering the concept of behavioral objectives. These materials were part of a unit of work on lesson planning and instructional design in an applied educational psychology course for undergraduate preservice teachers. Both lessons had the same objectives and technical content. The lesson for the control group was prepared according to standard principles of instructional design, but was not enhanced in any way to make it interesting. The experimental lesson was enhanced with strategies to stimulate curiosity, illustrate the practical relevance of the content, build confidence, and provide satisfying outcomes. Students were randomly assigned to the two lessons which they completed during one class period, including testing. Scores on the experimental lesson were significantly higher than for the control lesson.

Status of the CIS and IMMS

The four subscales of these instruments can have high intercorrelations, which makes it difficult to apply traditional factor analysis to the instruments and recover the expected four-factor structure (Huang, Huang, Diefes-Dux, & Imbrie, 2005). This is in part because these instruments are designed to measure situation-specific attitudes and not psychological constructs. Situational measures can vary tremendously, especially when respondents have a largely positive attitude toward the given situation. Thus, we used other methods to support the conceptual structure of the ARCS model and the associated measurement instruments. Naime-Diffenbach (1991) manipulated the motivational properties of a set of instructional materials by enhancing their attention and confidence characteristics and stripping all possible motivational aspects from their relevance and satisfaction characteristics. Her study confirmed that when there is actual variation in materials along these motivational dimensions, the differences will be reflected in scores on the measurement instrument.

Small and Gluck (1994) used a magnitude scaling approach to estimate the perceived closeness or distance between different aspects of motivational attributes, and their study confirmed the four-component taxonomy of the ARCS theory. More recent studies, such as Yu’s (2008) study of motivation and usability in a self-paced online learning environment, provide additional confirmation of the internal consistency and empirical validity of the IMMS.

Both of these instruments have been used in many studies and have even been translated into several other languages. It is beyond the scope of this book to include a summary of this literature, but its existence helps confirm the utility and validity of these instruments.

Motivational Tactics Checklist

Introduction

The Motivational Tactics Checklist (Table 11.11) was developed and revised over a period of several years. It focuses primarily but not exclusively on “print” types of material, whether on paper or on computer displays. It can be used to guide critiques of existing materials or for ideas when creating new materials. However, this is a catalog of ideas and characteristics, not a set of recommendations for every instructional product. As with all other aspects of motivational design, this list should be used in conjunction with the results of an audience motivational analysis. The instructions provide some guidance about how this is done.

Table 11.11. The Motivational Tactics Checklist.

Instructions

When using this checklist to support the design and development of new materials, review it for ideas and display features that are appropriate to the characteristics of your audience. In this case, you can develop an adaptation of the list that includes only those items you will use to guide design and to serve as an evaluative checklist during developmental tryouts and pilot tests.

When you use this checklist for guidance when reviewing existing products, there are several decisions to be made. As you examine the materials and consider the tactics, you can decide whether the materials are satisfactory, have a deficiency, or have an excess of tactics.

  • Satisfactory: This means that the given tactic is contained in the materials and it is an appropriate tactic to use with your audience.

  • Deficiency: A deficiency occurs when the materials lack motivational tactics that, given the characteristics of your audience, should be added.

  • Excess: An excess occurs when the materials contain motivational tactics that are not appropriate for your audience. An example of excess would be the inclusion of cartoons that the intended audience would consider “babyish,” or that detract from the seriousness of the content.

As before, you can develop a custom version of this checklist by deleting all the items that do not apply, adding supplemental tactic descriptions, and then using it as a design and evaluation aid.

Motivational Delivery Checklist

Introduction

The Motivational Delivery Checklist (Table 11.12) was developed by Bonnie H. Armstrong and me and is included here with the permission of both of us. This checklist is used to assess the motivational characteristics of instructor-led classes. It incorporates features that pertain to presentation style, learner focus, and other elements of good motivational and instructional design. In some cases, a less than satisfactory performance by the instructor could be the result of deficiencies in the material provided to the instructor. The investigation into the cause of the problem would occur after the problem was noted on this worksheet. The instructor can also use this checklist in advance of instruction to assess the completeness of the instructional materials, and as an aid to rehearsal.

Table 11.12. List of Items in the Motivational Delivery Checklist.

Instructions

You may use the checklist in any manner that serves your requirements. One approach is to check each item using the following notation:

  • E = Excellent

  • S = Satisfactory

  • I = Needs Improvement

  • O = Omitted (but should be included)

  • NA = Not Applicable

Summary

This chapter described several design aids, measurement instruments, and checklists that can assist you in motivational design and research activities. All of these items have been created in the context of application and proven to be useful, but they are only a small sampling of tools that people have developed, many of which I have been told about but have not seen. If you begin to implement the systematic process of motivational design, it is highly likely that you will create tools that benefit you as you do this work.

To me, the most important point to take from this chapter is that it underscores the fact that motivational design can be a systematic process. You can follow the process described in this book and use various tools and methods of documentation to constantly improve your skills with practice!