Introduction

For over 20 years, researchers have been trying to capture and understand how experienced instructional designers apply their knowledge and skill to solve complex problems of practice (Ertmer et al. 2008; Ertmer et al. 2009; Eseryel 2006; LeMaistre 1998; Rowland 1992; Wedman and Tessmer 1993). Among these efforts, researchers have examined how the design problem-solving process differs between experts and novices (Ertmer et al. 2008; Hardré et al. 2006; LeMaistre 1998; Rowland 1992), as well as the extent to which experienced designers use instructional design (ID) models and/or their components in their work (Wedman and Tessmer 1993; York et al. 2009).

Although there is some evidence to suggest that experienced designers apply ID models in their practice (York et al. 2009), they typically report adapting these models to their specific situations, using them heuristically rather than algorithmically (Kirschner et al. 2002; Nelson 1988; Romiszowski 1981). According to Dudczak (1995), heuristics are general guidelines that experienced designers apply when making decisions under uncertain conditions, thus “minimiz[ing] cognitive expenditure” (Lewis 2006, p. 264) during ill-structured problem solving. Based on principles similar to those espoused by cognitive load theory (which suggests that working memory has a finite amount of processing capacity; Sweller et al. 1998), heuristics may offer one means for reducing the cognitive load experienced while solving difficult or complex problems (Lewis 2006).

Recently, Silber (2007) suggested that, when solving ID problems, instructional designers follow a set of heuristic principles, rather than procedural ID models: “ID, as experts do it, is a problem-solving process, not a procedure, made up of a thinking process and a set of underlying principles” (p. 6). This is similar to what Kirschner et al. (2002) reported, “While ID models often inspire designers, their activities typically don’t reflect the systematic, step-by-step approach as prescribed in traditional ID models” (p. 91, emphasis added) and thus, “designers’ implicit cognitive strategies and rules-of-thumb heavily influence the design process” (p. 87).

However, despite this acknowledgement in the literature, we actually know very little about the heuristics designers use, what they look like, or how they relate to key ID competencies. As noted by Gero (cited in Kirschner et al. 2002), “Given the large body of research design it is surprising how little we know about designing” (p. 61). This study was designed to fill that gap, that is, to identify a set of common heuristics experienced designers report as being important to the instructional design process. Furthermore, we hoped to understand how these heuristics related to key competencies expected of practitioners, as outlined by the International Board of Standards for Training, Performance, and Instruction (IBSTPI 2000) (see Table 1).

Table 1 IBSTPI instructional design competencies

Where do heuristics come from?

According to Kirschner et al. (2002), “In most design projects, deviations and discrepancies from the general ISD model occur as design practitioners selectively follow ID model prescriptions” (p. 93). While many of these deviations are likely due to specific project constraints, such as time and budget (Wedman and Tessmer 1993), they also can be attributed to the unique set of design experiences the practitioner brings to the table. Jonassen (1997) explained that because of the uncertainty encountered during the problem-solving process, experts tend to rely on knowledge gained from past experiences rather than on that learned from textbooks.

The results from two studies by Perez and his colleagues support this suggestion. In the first study (Perez and Emery 1995), five expert and four novice designers were given a troubleshooting problem and asked to think aloud during the design problem-solving process. Perez and Emery reported that experts and novices used different types of knowledge: whereas novices relied on theoretical knowledge (such as ID models), experts also drew on strategic knowledge based on experience. In a follow-up article with the same participants, Perez et al. (1995) described how their expert instructional designers used ID principles, or heuristics, during the design process. For example, one of their participants “suggested that his practice was to treat theoretical principles as heuristics” (p. 340).

Given this, it seems reasonable to expect that the core knowledge and/or competencies of the field provide the foundation from which ID heuristics are derived. According to Romiszowski (1981), experienced designers apply heuristics during ill-structured problem solving, based on the ID models taught in school. This idea was supported by results reported by Ertmer et al. (2008) in their study of seven expert designers who engaged in a think-aloud process while reading and analyzing a complex ID case narrative: “…[participants] did not follow their models on a one-to-one basis like a recipe. Instead they used their models more broadly and heuristically” (p. 30).

Based on her research with 24 expert instructional designers, Visscher-Voerman (1999) identified 11 design principles, or heuristics, on which there was at least a 75% positive agreement among the participants. While some of the principles related to successfully implementing specific steps in the ID model (e.g., “An essential part of the analysis phase is a consideration of possible pitfalls and problems during the design and implementation phases,” p. 173), others related to professional and managerial competencies such as securing stakeholder buy-in (e.g., “During the design process, designers should pay as much attention to creating ownership with clients and stakeholders, as to reaching theoretical or internal quality of the design,” p. 173), and overseeing the entire design project (e.g., “Designers should share the responsibility for creating favorable conditions for the implementation of a design,” p. 173). This suggests that ID principles, or heuristics, stem, not solely from the ID theories or models learned in school, but from the entire set of responsibilities involved in designers’ practice.

What do heuristics look like?

Because heuristics are based, at least to some extent, on an individual designer’s unique experiences, the expectation is that they would be fairly idiosyncratic (Ertmer et al. 2009). According to Visscher-Voerman (1999), one of the factors that influences experts’ design processes and solutions is the designers’ frames of reference, which comprise their experiences, perspectives, and ideas from previous projects on which they have worked. Similarly, Ertmer et al. (2008) described how the frames of reference used by their seven experts to solve the given ID problem related primarily to their current roles in the field (e.g., administrator, consultant).

Yet, given the potential link between heuristics and the core knowledge and competencies of the field (York et al. 2009; Perez et al. 1995; Visscher-Voerman 1999), it may be possible to identify those heuristics that are shared and commonly applied by designers across a variety of contexts. Indeed, this was the purpose of the study conducted by Kirschner et al. (2002). Starting with the heuristics originally identified by Visscher-Voerman (1999), Kirschner et al. asked 15 expert designers to identify the “top 3 design principles” that were most important to the success of a design project. The results demonstrated strong agreement among the designers in both studies. This suggests that while heuristics may stem from a unique set of design experiences, they often have a universal quality to them as well (York et al. 2009). As such, it may be possible to identify those that are commonly applied by many different designers.

Purpose

Researchers generally agree that when individuals are making decisions or solving problems under uncertain conditions, they rely on heuristics derived from both past experiences and previous knowledge (Kirschner et al. 2002; Nelson 1988; Visscher-Voerman 1999). This study was designed to build on the findings of Visscher-Voerman and Kirschner et al., as well as on the results of a preliminary study we conducted with 16 experienced designers (York et al. 2009). Through our previous qualitative analyses of the practitioners’ design “stories,” we identified 59 heuristics expressed by one or more of the designers. Thus, the goal of the current study was to verify this initial list of heuristics by including a larger number of participants, working in a wider range of contexts. Furthermore, we hoped to identify the relative importance of each heuristic to the ID process by examining participants’ ratings of agreement (i.e., on a scale from 1 = not at all important to 6 = very important) related to each heuristic. Finally, we examined the extent to which the identified heuristics related to core ID competencies, as outlined by the International Board of Standards for Training, Performance, and Instruction (IBSTPI 2000). This last action was performed as a first step in understanding the potential genesis and/or future application of each heuristic.

Method

The Delphi technique (Linstone and Turoff 1975), consisting of a series of questionnaires and feedback used to gather the collective judgment of a panel of experts (Dalkey and Helmer 1963), was used to verify the heuristics identified in the 2009 study (York et al. 2009) by a new panel of experienced instructional designers (n = 31). In the 2009 study, 16 experienced professionals were interviewed and asked to tell a story in which a complex or challenging instructional design problem was solved. From these stories, 59 heuristics emerged. In the current study, three successive Delphi rounds were conducted with a panel of 31 practicing instructional designers until consensus was reached.

Selection of participants

An email was sent to 54 experienced instructional designers requesting their participation in a Delphi study examining ID heuristics. After receiving only 14 responses, we posted a request on LinkedIn.com seeking additional participants; 80 people responded, for a total of 94. A demographic survey requested the following information: name, email, gender, age range, current position and title, formal education, summary of instructional design background, and instructional delivery formats currently used in their practice. From the demographic survey, a convenience sample (Patton 1990) of the 50 most experienced designers was selected, using criteria published in the ID and expertise literature (Eseryel 2006; LeMaistre 1998; Perez et al. 1995; Rowland 1992). These criteria included: (1) a minimum of 10 years of hands-on experience, (2) currently practicing ID, (3) number and level of educational degrees, (4) nominated or recognized by peers, (5) diverse experiences, (6) ongoing training/education/certification, and (7) experience as a manager/trainer of apprentice instructional designers. The selected 50 designers were emailed an invitation to participate on the Delphi panel. Thirty-five responded positively; however, 4 withdrew between Round I and Round II, leaving a panel of 31 participants who completed all three surveys.

The final panel consisted of 18 females (58%) and 13 males (42%). Ages were reported by range: 31–40 years (n = 5; 16%), 41–50 (n = 14; 45%), 51–60 (n = 10; 32%), and 61+ (n = 2; 7%). The panel members averaged 19.7 years of ID experience, ranging from 10 to 43 years. The highest degrees earned by participants included a technical diploma (n = 1), Associate’s degree (n = 1), Bachelor’s degree (n = 2), MBA (n = 4), M.Ed. (n = 2), Master’s degree (n = 14), Ed.D. (n = 3), and Ph.D. (n = 4). Although 4 of the panel members had less than a Master’s degree per criterion 3, their on-the-job training and experiences compensated for the lack of a formal degree. For example, the participant with the technical diploma had over 22 years of experience and had earned a number of training certificates (including the ID Certificate from Darryl L. Sink and Dale Carnegie Certified Coach), satisfying criterion 6. The participant with the Associate’s degree had 25 years of instructional design experience, one-third of it spent managing other instructional designers, which meets criterion 7. All panel members were currently practicing instructional design, with a range of job titles; the most frequent job title was instructional designer (n = 11), followed by consultant (n = 6).

Delphi process and timeline

The three Delphi rounds were conducted over a 2-month period. Surveys were provided online, hosted on a secure server. An email was sent to participants describing the Delphi procedure as well as how to access the surveys. Follow-up emails were sent to participants if they had not responded during the open period. Following the third survey, an email was sent to inform participants that the Delphi rounds had ended.

The Round I survey contained specific instructions on how to access the survey, rate the heuristics, and provide open-ended comments. The survey contained three parts: (a) the list of 59 heuristics (identified in the previous study; York et al. 2009), to be rated on a 6-point Likert scale (from 1 = strongly disagree to 6 = strongly agree) as to their importance to the success of the ID process, (b) a space for comments after each heuristic, and (c) a space to add additional heuristics. Participants were asked to include comments to justify their ratings, to question or clarify a given heuristic, or to elaborate on a heuristic (see Appendix for the survey instrument). Panel ratings were analyzed using the mean, median, mode, standard deviation, and interquartile range (IQR). Frequency distributions and graphical representations were created for each heuristic. Heuristics that reached panel consensus in Round I were not included in Round II (Anderson-Woo 2008). A heuristic reached consensus in Round I if either of the following two conditions was met (a computational sketch follows the list):

1. IQR less than or equal to 1 AND 75% agreement on a rating of 5 or 6 (agree, strongly agree) OR on a rating of 1 or 2 (disagree, strongly disagree).

2. A 97% frequency rating in the 4, 5, 6 categories (mildly agree, agree, strongly agree) OR in the 1, 2, 3 categories (mildly disagree, disagree, strongly disagree); with a panel of 31, 97% meant all but one participant.
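To make these two decision rules concrete, the following minimal sketch (our illustration, not code from the study; the function name and sample ratings are hypothetical) shows how Round I consensus could be checked for a single heuristic:

```python
# A minimal sketch (ours, not the study's code) of the two Round I
# consensus conditions; the function name and sample ratings are
# hypothetical.
import numpy as np

def reached_consensus_round1(ratings):
    """Return True if either Round I consensus condition holds."""
    r = np.asarray(ratings)
    # Condition 1: IQR <= 1 AND >= 75% of ratings in {5, 6} or in {1, 2}.
    q1, q3 = np.percentile(r, [25, 75])
    agree = np.isin(r, [5, 6]).mean()      # agree / strongly agree
    disagree = np.isin(r, [1, 2]).mean()   # disagree / strongly disagree
    cond1 = (q3 - q1) <= 1 and (agree >= 0.75 or disagree >= 0.75)
    # Condition 2: >= 97% of ratings on one side of the scale
    # (4-6 or 1-3); with 31 panelists, 97% means all but one.
    high = np.isin(r, [4, 5, 6]).mean()
    low = np.isin(r, [1, 2, 3]).mean()
    cond2 = high >= 0.97 or low >= 0.97
    return cond1 or cond2

# Example: a heuristic most panelists rated 5 or 6 meets condition 1.
print(reached_consensus_round1([6, 5, 6, 6, 5, 5, 6, 4, 5, 6, 5, 6]))  # True
```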

After the first round, 30 heuristics were removed: 29 based on the first condition and 1 based on the second. Thus, the panel agreed that 29 of the original 59 heuristics were important to the success of their ID practice. For the remaining 29 heuristics, statistical measures (e.g., mean, median, mode, frequency, standard deviation) were included in Round II, along with the panel comments made during Round I, allowing participants to reflect on others’ justifications for their ratings. Participants were also presented with their own Round I responses and asked either to retain their original ratings or to modify them based on the new information. Finally, 15 new heuristics were added to the Round II survey, based on suggestions made by the panel during Round I. Thus, in this round, participants rated 44 heuristics and provided comments to support their ratings.

In Round II, the criteria for determining consensus were less restrictive than in Round I. We retained the first criterion but lowered the second from 97% to 80%, which seemed more reasonable than the high bar set for Round I. In addition, we applied a stability criterion: if 20% or fewer of the participants changed their ratings of a heuristic from Round I to Round II, this stability indicated that participants were unlikely to change their ratings enough to reach consensus. To calculate stability, the frequencies of the Round I and Round II responses were determined; then the net person-changes (total units of change divided by 2) for a particular question were divided by the number of participants (Scheibe et al. 1975). Using this criterion, seven heuristics were eliminated before the Round III survey. In addition, one of the 15 newly added heuristics was split into two statements based on participant comments. Ultimately, 10 heuristics from Round II were included in Round III; no new heuristics were added.
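A brief sketch of the stability index as we read Scheibe et al. (1975), with hypothetical names and toy data: the rating frequencies from the two rounds are compared, the absolute differences are summed and halved to yield net person-changes, and that figure is divided by the panel size.

```python
# A sketch of the stability index as we read Scheibe et al. (1975):
# net person-changes (total units of change / 2) divided by the number
# of panelists. Names and data are hypothetical; both rounds are assumed
# to have the same respondents.
from collections import Counter

def rating_stability(round1, round2):
    """Proportion of panelists who changed their rating between rounds."""
    f1, f2 = Counter(round1), Counter(round2)
    total_units = sum(abs(f2[c] - f1[c]) for c in set(f1) | set(f2))
    net_person_changes = total_units / 2
    return net_person_changes / len(round1)

# Example: 2 of 10 panelists moved from a rating of 4 to a rating of 5.
r1 = [4, 4, 4, 4, 5, 5, 5, 6, 6, 6]
r2 = [4, 4, 5, 5, 5, 5, 5, 6, 6, 6]
print(rating_stability(r1, r2))  # 0.2, i.e., at the 20% threshold
```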

The Round III survey was created, reviewed, and disseminated to the panel. As before, participants were provided statistical measures such as mean, median, mode, frequency, and standard deviation as well as their original responses. In addition, they could view all panel comments made on the previous survey for each of the 10 remaining heuristics in the survey. Participants were asked to retain or revise their original ratings based on the new information. Consensus and stability criteria remained the same as that used for Round II. Five heuristics reached consensus in Round III.

Data analysis

To compare the relative strength of agreement among the identified heuristics, we compiled all 61 into a rank-ordered list based on mean ratings. To determine the extent to which the final set of 61 heuristics represented core competencies of the field, as identified by IBSTPI (2000), the two researchers independently sorted the heuristics into the four competency categories, also identifying the specific competency within each category that was best represented by each heuristic (e.g., Professional Foundations—Communication). Heuristics were dual-coded if they seemed to fit more than one category. The researchers then shared their categorizations with each other and, through discussion, reached consensus on those that initially had been coded differently (18%).
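As an illustration of the independent-sort step, a toy sketch (heuristic IDs and assignments are hypothetical) that computes the share of heuristics the two coders initially categorized differently:

```python
# A toy sketch (hypothetical IDs, toy data) of the independent-sort step:
# each coder assigns one or more IBSTPI categories per heuristic, and we
# compute the share of heuristics initially coded differently (18% in
# the actual study).
coder_a = {
    "h01": {"Professional Foundations"},
    "h02": {"Planning and Analysis"},
    "h03": {"Design and Development", "Implementation and Management"},  # dual-coded
}
coder_b = {
    "h01": {"Professional Foundations"},
    "h02": {"Planning and Analysis", "Professional Foundations"},
    "h03": {"Design and Development", "Implementation and Management"},
}

disagreements = [h for h in coder_a if coder_a[h] != coder_b[h]]
print(len(disagreements) / len(coder_a))  # ~0.33 for this toy data
```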

Results and discussion

This study was designed to verify the list of 59 heuristics that emerged from the 2009 study (York et al. 2009) by including a larger number of participants who were working in a wider range of contexts. In addition, we hoped to identify the relative importance of each heuristic to the design process by examining the level of agreement among the participants. Finally, we classified the identified heuristics into the four main categories of IBSTPI competencies (2000) as one way to examine the relationship between the heuristics and key ID competencies.

Heuristics identified as important to ID process

Three rounds of the Delphi process resulted in panel consensus on 61 of 75 instructional design heuristics (see Table 2): 47 from the 2009 study and 14 added by the panel during Round I. Consensus means that the majority of the panel agreed or strongly agreed that the specific heuristic under consideration was important to the practice of instructional design. Interestingly, of the 16 new heuristics that grew out of panel suggestions in Round I (15 were suggested and one was split into two), 14 reached panel consensus. It should be noted that the panel identified the various heuristics after being prompted to reflect on each. This does not necessarily mean that these heuristics already existed within the designers’ repertoires, or that these were the only heuristics in their repertoires. It is also important to note that agreeing that a heuristic was important does not necessarily mean the designers actually applied it in their practice. Determining which heuristics are actually used in practice was beyond the scope of this study, but it is an important area for future research.

Table 2 Heuristic order based on mean rating of agreement (1–6) as to its importance to the ID process

Relative importance of identified heuristics

Although participants were not asked to rank-order the heuristics according to importance, a rank order was determined based on the mean ratings of agreement from the Delphi surveys. The means for the 61 heuristics ranged from 4.26 to 5.88 (out of 6.0). The means for the top ten heuristics ranged from 5.51 to 5.88 (a spread of less than 0.4 of a point).

Among the top 10 heuristics, participant agreement was equally high (M = 5.88) for the first two: “Know your learners/target audience,” and “Determine what it is you want your learners to perform after the instructional experience. What is the criterion for successful performance?” Of the top 10 heuristics, three related to analyzing the learner/audience, which is a key component of most, if not all, ID models (Gustafson and Branch 2002).

An additional four of the top ten heuristics related to client interaction. Working with a client is one of the major responsibilities of an instructional designer (Liu et al. 2002; Rowland 1993). The most highly rated heuristic regarding client interaction, according to mean rankings, was, “Be honest with the client.” Professional ethics seemed to underlie that heuristic, as well as this one: “You need to build trust with the client. This can be done through explaining what you are doing, why you are doing it, and how it is of value to them.” Liang and Schwen (1997) discussed the importance of professional ethics among designers, noting, “expert practitioners consistently demonstrate high ethical standards, which guide their personal and professional conduct” (p. 44). This relates to the definition of educational technology proposed by Reiser (2007), which contains the word ethics and describes how instructional designers need to “maintain a high level of professional conduct” (p. 6).

Relationship between heuristics and IBSTPI competencies

The IBSTPI instructional design competencies are divided into four categories: Professional Foundations, Planning and Analysis, Design and Development, and Implementation and Management (2000). Divided among these four categories are 23 competencies (see Table 1). In this section we describe how the 61 heuristics identified by the Delphi panel align with the IBSTPI competencies (see Table 3). In general, the heuristics identified by the panel were fairly evenly distributed across the four IBSTPI categories (17, 17, 18, and 10 heuristics, respectively; because of dual coding, these counts sum to 62), suggesting that these categories of competencies were considered equally important to the success of the ID process. We discuss each of these categories in greater depth below.

Table 3 Heuristic comparison to IBSTPI competencies

Professional Foundations

Seventeen of the 61 heuristics aligned with the Professional Foundations competencies, with the majority aligning with the competency, “Communicate effectively in visual, oral and written form.” Communication is an important, ongoing component of the instructional design process: the designer must have the interpersonal skills to communicate with a number of different stakeholders, including the client, design team members, production personnel, and others, all of whom may use different terminology. The communication heuristics emphasize how important it is for an instructional designer to know how to communicate with all the people involved in the instructional design process, and to do so throughout the lifecycle of the project (McDonald 2008).

Interestingly, none of the heuristics aligned with the fourth competency, “Apply fundamental research skills to instructional design projects.” This suggests that the panel participants either did not engage in the research process or perhaps did not recognize that they used research skills when conducting ID. This does not negate the importance of the fourth competency but simply means that it either did not emerge from the original interviews in the 2009 study or was not suggested as a new heuristic by the panel. Perhaps if it had been included on the original list the panel would have been prompted to consider the importance of applying research skills to ID projects. Future research is needed to determine practitioners’ perceptions of the importance of this IBSTPI competency.

Planning and Analysis

Of the 17 heuristics that aligned with the Planning and Analysis competencies, 10 were associated with the competency, “Conduct a needs assessment,” which is one of the foundations of the instructional design process. This emphasis supports the findings of Rowland and DiVasto (2001), who stated that analysis is one of the “big ideas” that designers use when engaging in design work. The 14 experts in the Rowland and DiVasto study all agreed that the instructional design process includes “thorough analysis, for example, of learners, task, and setting” (p. 14). In addition, three of their experts claimed that, in general, not enough analysis takes place; this claim is echoed by the heuristic, “Invest as much time as you can in your audience analysis.” The heuristic, “Ask yourself, ‘Is instruction the solution to this problem?’” also supports Rowland and DiVasto’s (2001) finding that designers use needs analyses to determine “when instruction was the right answer” (p. 15), and is consistent with the general ID literature (Gustafson and Branch 2002). Determining whether instruction is necessary tends to be one of the key things a designer must do after first meeting with the client.

Design and Development

Among the 18 heuristics that aligned with the Design and Development competencies, 7 aligned with competency 13, “Select, modify, or create a design and development model appropriate for a given project,” as well as with competency 17, “Design instruction that reflects an understanding of the diversity of learners and groups of learners.” The heuristic, “Generate multiple possible solutions that will solve the problem,” supports the findings of Liu et al. (2002), who suggested that instructional designers must use their best judgment in creating a solution for the client. While Liu et al. did not specifically mention developing multiple solutions, ill-structured problems typically have multiple solutions, and the designer needs to decide which is best to recommend to the client (Jonassen 1997). The heuristic, “When designing instruction, consider active learning,” supports Mayer’s (2003) recommendation to use different methods to promote active learning, even with non-interactive media. Participants also agreed that scaffolding is needed but disagreed about its timing and quantity: one participant felt scaffolding should be used in the beginning and tapered off near the end, while another felt it was necessary to hold off on scaffolding in the beginning. The instructional design literature supports the use of scaffolding, but how it is used depends on the context (Van Merriënboer et al. 2003).

Implementation and Management

Among the 10 heuristics that aligned with the Implementation and Management competencies, 8 supported competency 20, “Promote collaboration, partnerships and relationships among the participants in a design project.” The panel strongly supported the idea that design is a people process (as one participant stated). “The team is critical. Involve the right people at the right time” ranked 12th among the 61 heuristics. A great deal of the instructional design literature supports the notion that instructional design is a team process (Greer 1992; Liu et al. 2002; Rowland 1993). The most highly rated heuristic related to the design team was, “Figure out who all the stakeholders are in the room. And figure out who is not in the room that is still a stakeholder.” This heuristic supports Cox’s (2009) suggestion that the designer needs to conduct a stakeholder analysis to determine who the stakeholders are throughout the project. Carroll (2000) elaborated on the purpose for a stakeholder analysis: “Various stakeholders or team members may change their interests or priorities, or may even leave the team. Others with unknown interests and priorities may join” (p. 47). These statements suggest that the designer needs to continually reassess who the stakeholders are throughout the project.

The heuristic that proposed that reviews by subject matter experts were a necessary component of the instructional design process supports Keppell’s (2001) findings. Often, the content being designed is quite unfamiliar to the designer, whereas the subject matter expert brings that specialized knowledge to the table. Keppell described an iterative process of explanation and clarification that occurs between the designer and the subject matter expert throughout the design process.

Implications and conclusions

Recent conversations among ID scholars and practitioners have questioned the efficacy of teaching ID models to novice designers, given that, in practice, models are applied neither consistently nor uniformly (Wedman and Tessmer 1993; Dick 1996). While some have argued that we should continue to teach models to novice designers because of the foundational knowledge they provide (Dick 1996), others believe we should be teaching relevant skills, such as problem solving (Jonassen 2008), communication (Summers et al. 2002), and project management (Williams van Rooij 2011). Yet what are the specific skills that students should learn? Although this question is likely to engender a great deal of debate among ID professionals, the results of this study suggest that the heuristics practitioners believe to be important to the ID process align relatively well with the IBSTPI competencies. So, while these competencies include a number that relate to steps in procedural ID models (learner analysis, design strategies, etc.), the focus broadens to include a larger set of skills, such as those related to communication and management.

Participants in our study did not suggest adding any heuristics that related directly to an ID model. Instead, they seemed to focus more on the practice of ID and what it takes to be successful, which, of course, goes beyond applying the steps of an ID model. This is similar to what Ertmer et al. (2008) found: although the participants in their study did not talk specifically about ID models, they applied such mental models to their practice. Still, it is important to remember that we did not ask participants whether they had learned about or used a specific model in their practice. Different heuristics may have emerged had we asked participants to rate the importance of completing various steps in an ID model. In future research we should consider determining whether, and to what extent, practitioners use or consider a specific model (or steps in a model) when working on design projects.

To address our research question, “How do heuristics relate to key ID competencies?”, we considered how the heuristics identified by the Delphi panel related to the essential knowledge and skills needed to be successful in the field, as identified by IBSTPI (2000). We chose these competencies because they appear to be generally recognized by our professional organizations (e.g., AECT and ISPI) as being important to our graduate education programs. As such, we made the assumption that our panel participants had learned at least some of these skills during their graduate programs. As noted earlier, practitioners have reported that they tend to “treat theoretical principles as heuristics” (Perez et al. 1995, p. 340). However, it is important to consider that the opposite may be true; that is, the IBSTPI competencies (2000) may simply capture, rather than prescribe, what practitioners need to know and be able to do. If this is true, then the heuristics identified in this study and the IBSTPI competencies might be expected to look very similar. In other words, both the heuristics and the IBSTPI competencies may be capturing common knowledge shared by practitioners. However, given that we did not probe the actual source of these heuristics, additional research is needed to sort through these various possibilities.

In this study, our participants did not mention specific ID models or refer to the IBSTPI (2000) competencies by name. Instead, they appeared to approach their task of identifying important ID heuristics from a very practical perspective: What does it take to be successful in this field? This is also supported by Visscher-Voerman’s (1999) findings. That is, of the 16 principles identified by her participants, only one mentioned the use of design models: “Successful design is served by the use of step-by-step schemes and design models, provided that they are adapted” (p. 173). And similar to what others have reported (Wedman and Tessmer 1993), the practitioners in her study also emphasized the need to adapt the models learned in school.

Although in this study we examined how the 61 heuristics identified by the Delphi panel aligned with the IBSTPI competencies, this is not to suggest that the heuristics do not incorporate steps from ID models as well. It is not our intent to discount the use of ID models in the education of our students; models provide novice designers with a starting point when beginning a new project. However, it also might be useful for novice designers to understand how experienced practitioners translate these models and competencies into practice. Heuristics appear to offer one possibility; that is, heuristics could potentially serve as scaffolds for novices who are unsure how to translate their textbook knowledge into practice. Furthermore, if these heuristics appeared in the form of stories, they could, as Jonassen and Hernandez-Serrano (2002) noted, “support a broader range of problem solving than any other strategy or tactic” (p. 65).

Limitations and suggestions for future research

This study is based on the assumption that designing involves the application of heuristics. However, we did not ask the panel members to report how frequently they used each heuristic, nor did we ask about the contexts in which they used them. Believing a heuristic is important does not necessarily mean it is applied in practice. Determining which heuristics are actually used, and in what contexts, will be the focus of our future research.

The heuristics identified in this study aligned relatively well with the IBSTPI competencies (2000). Although some competencies were not represented by the set of heuristics identified by our Delphi panel, this does not negate their importance. It simply means they did not emerge from this panel’s deliberations. Future research is needed to determine practitioners’ perceptions of the importance of all of the IBSTPI competencies.

In addition, the relative importance of the different heuristics to the instructional design process (as well as to the instructional design field) needs a more thorough examination. Perhaps a Delphi panel could rank-order the current heuristics as to their importance to the instructional design process. It could also be productive to ask new instructional designers about their experiences, for example: (a) What was the most important thing you learned? and (b) About what do you wish you had learned more? From this, we could tailor our graduate education to include the elements practitioners find important, rather than only what the instructor or textbook emphasizes.

This study was designed to generate a thoughtful analysis of heuristics used by experienced instructional designers. However, one limitation is that the fields in which the participating designers worked did not represent all possible areas of instructional design work; for example, designers working in the military were not included because we did not have access to any. Future research will present military instructional designers with these heuristics to determine whether designers in different ID fields have the same perceptions about which heuristics are important to the design process.

Future research will also focus on determining the best methods for sharing the resulting heuristics with novice designers, and on whether doing so impacts their initial experiences as instructional designers. Some questions we plan to pursue are: (a) Can we teach heuristics to novice instructional designers? (b) What methods should we use to provide this information (stories, cases, guest speakers)? and (c) How does this impact their practice?