Introduction

Learning is a journey with the teacher as a guide (Kimbell 2007, p. 248).

The quotation above highlights teachers’ professional responsibility to support their students during the process of learning. As a guide you have to find a starting point suitable for the group you are guiding and from there plan their progress towards the chosen destination. Travellers’ starting points may differ, so in order to make progress each traveller has to start from where they are. The same applies to education: teachers have to find out where their students ‘are’ and plan future learning situations from there in order to get the students to the desired destination (in time). Teachers commonly spend a lot of time and effort establishing, in various ways, an understanding of the level of knowledge and skills acquired by their students (Kimbell 2007). Hattie (2012) argues that this takes too much time and effort, compared with being able to continue from where the previous teachers left off. Accordingly, teachers need support in order to enhance both their assessment practice and their confidence in assessment.

In 2008 Swedish teachers were introduced to a new assessment tool: the mandatory individual development plan (IDP) with written assessments. When introduced, the IDP document was presented as a tool for teachers’ follow-up of their students’ performance, including each student’s current state of knowledge (provided by the teacher) as well as suggestions (designed in consultation with the student and her/his guardians) on ‘where to go next’ in order to facilitate the student’s future learning (NAE 2009a). In this article

  • the focus lies on technology education in primary school,

  • the design and content of authentic IDP documents in technology are studied,

  • teachers’ use of IDP documentation is discussed.

Background

Technology: a side-lined school subject

Christer Fuglesang, Sweden’s first astronaut and a member of the government-appointed Teknikdelegationen, has summarized the state of technology education in compulsory school in Sweden in the following way:

Technology is everywhere, except in school. (Fuglesang 2010)

In Sweden technology education is one of 16 mandatory subjects in the nine-year compulsory school. It has been a mandatory subject for more than 30 years. Nevertheless, according to numerous reports, it still lacks a strong teaching tradition and roots (ASEI 2005; Fabricius et al. 2002; SSI 2009; Teknikdelegationen 2010). Some of these reports even state that instruction in technology is not offered to students in sufficient quantity to give them the opportunity to reach the goals in the national curricula. It is noteworthy that these reports refer to the quantity of teaching, not the quality. The above-mentioned reports unanimously state that the situation is most alarming in the early years of schooling; with respect to primary school technology education, ‘nothing much’ seems to be going on at all. Regarding the quality of the technology education actually performed (in compulsory school), findings reveal variation in quality; for example, instruction is not always aligned with the current steering documents (ASEI 2005; Bjurulf 2008; Blomdahl 2007; Klasander 2010). One suggested reason for this ‘variation in quality’ is the lack of certified technology teachers (ASEI 2005; Nordlander 2011). Today the majority of teachers teaching technology are not trained in the subject (Hartell and Svärdh 2012).

Teachers’ views of the technology subject have consequences for the assessment of it (Black 2008; Moreland et al. 2008). Even though the subject has been mandatory for the last 30 years, consensus regarding its content and practice has not yet been reached within the Swedish context (Norström 2011; Skogh 2006). The technology subject is not well defined. On the contrary, the Swedish national curriculum is, according to Norström (2011), open to interpretation, which, to a greater or lesser extent, may affect the interpretation of pupils’ progress as well. Understanding pupils’ progress is vital for teachers according to Hattie (2012), and in this context the IDP could serve as a tool for tracking students’ progress, an educational GPS device so to speak (Hartell 2010).

Unknown goal fulfillment in technology

Today we know very little about what Swedish pupils are learning from technology education in school, and more research is needed (Hartell 2012). The level of student attainment in technology is not easily described or interpreted. On the one hand, national statistics state that goal fulfilment in technology (at the very end of compulsory school, year 9) is among the highest of all mandatory subjects (Hartell 2011). On the other hand, numerous reports (see previous section) have revealed serious deficiencies regarding the quality (and quantity) of instruction in technology offered in schools. The situation is confusing, to say the least.

The need for a national follow-up on technology education in Sweden has been highlighted by several stakeholders (e.g. Teknikdelegationen 2010). In 2003 the National Agency for Education (NAE) initiated an evaluation, the NU03 report (NAE 2004), of all mandatory subjects in compulsory school except technology, home language/mother tongue and modern languages. The exclusion of technology and Spanish from this evaluation is commented on by the NAE in the following words:

About Spanish and Technology the level of knowledge is unclear, which motivates a complementary evaluation.

My translation, p. 29 (NAE 2004).

There is, however, an important distinction among the excluded subjects worth highlighting. In contrast to home language and third language (which are voluntary), technology is a mandatory subject for all students in the Swedish compulsory school system.

Teacher-based assessment

The Swedish monitoring system is based on individual teacher assessments of the students’ positions in relation to defined goals in the curricula. All students are expected (and entitled) to reach the targets specified in the curricula. The teacher interprets the syllabus, plans the teaching and assesses the students’ performance in relation to the stipulated targets. Thus each individual teacher is allowed to use her/his own methods and strategies regarding teaching and assessment activities (Klapp Lekholm 2010).

Need for assessment guidelines

There is always a risk of teachers being influenced by their own misconceptions and prejudices regarding their students’ abilities. Such misconceptions may blur the interpretation of the students’ positions and learning possibilities, resulting in limitations regarding the students’ opportunities for progress (Gipps 2004; Kimbell 2007; Rosenthal and Jacobson 1968). Teachers might, for example, expect that their students have reached a certain ‘spot’ before (or after) they actually have, or that their students will be travelling towards their learning goals at the same speed as the teacher is teaching (Kimbell 2007; Wiliam 2011). Teachers need to be aware of the difficulties of assessment; this awareness is a crucial prerequisite for dealing with assessment properly.

The conclusions stated in the previously mentioned NU03 report are clear: teachers in Sweden are in great need of support in interpreting the different syllabuses and in formulating subject content and assessment criteria based on them. As a consequence of the NU03 findings, teacher support material concerning planning and assessment was compiled by the National Agency for Education (NAE 2008a, b). Most subjects in compulsory school were covered by this effort. Technology education was unfortunately excluded from this initiative.

To moderate teachers’ assessment and grading, external national tests in selected subject areas (mathematics, Swedish, English and science) have been provided by the NAE. Again, technology has been excluded. The aim of these national tests is to provide guidance to the teachers before they make their final decision regarding what grades a student will receive. The national tests are externally produced, but no external referees are involved in the final grading procedure; accordingly, the teachers themselves undertake the final grading of these tests. The intention is not for the results of the national tests and the final grade to be perfectly consistent, and reports show that there is a considerable difference between the grade achieved on the test and the final grade (Forsberg and Lindberg 2010; NAE 2010a; SIRIS).

In general, Swedish teachers lack training in assessment (Lundahl 2009). Some interventions have been undertaken by the NAE, and locally initiated interventions (within municipalities) have occasionally also been offered to teachers. The target groups for most of these interventions have been teachers in secondary school, as they are obliged to grade their students from year 8 (Fagerlund and Högberg 2010). Consequently, the teachers targeted in this study (primary school teachers) have seldom been included.

Information about student progress

Regulations state that students/children and their guardians are to be continuously informed about the child’s progress in school. To support this interaction, a teacher-child-guardian meeting is to be scheduled at least once every semester from year 1 to year 9 (NAE 2009a).

The purpose of these meetings is (1) to provide information to all concerned parties about the current status of the pupil (achieved skills/knowledge) and, foremost, (2) to jointly plan for the future progress and well-being of the student. Regarding the outcome of these meetings, research indicates that the intended purpose is not always met (Hofvendahl 2006; Vallberg-Roth 2010).

The individual development plan with written assessments

In 2008 all Swedish schools were supplied with a new assessment device: the individual development plan with written assessments (IDP) (NAE 2009a, b; SKOLFS 2009:16). The use of the IDP assessment device is mandatory for all students in compulsory school, from school year 1 through to year 9. The main motive behind the introduction of written assessments was to improve the clarity of the communication and information about students’ knowledge development in school. This was seen as necessary in order to secure opportunities for students to develop to the limit of their capacity, and the overarching aim of the IDP document is to enhance possibilities for student learning (NAE 2010b).

The IDP document is to be completed in writing by the teacher, together and in agreement with the guardians, at least once every semester during the teacher-student-guardian meeting. The actual design of the IDP template used in a school is decided locally by the school’s principal. However, regulations state that the IDP must (1) include a description of the current educational position (learning status, skills, etc.) for each student in all subjects given and (2) provide a description of the suggested strategies regarding how the individual student should proceed towards the goals set in the national curriculum and syllabuses. Thus the aim is both to find the current starting position and to suggest ways to make further progress towards the desired destination, that is, to reduce the gap between the current position and the position desired according to the national curricula. The IDP is a public document and thus must not include any confidential information that could harm the student; it shall continuously be adjusted to the needs of the individual student, that is, it is foremost intended to be formative (NAE 2009a, b). Hirsh (2011) categorizes the IDP document as ‘long cycle’ formative assessment, which according to Wiliam (2009) is the least effective cycle. Another aim is for teachers to establish where their students currently are, so as to be able to plan their future teaching from there. The information in the IDP document is allowed to resemble the grades given in lower secondary school. However, as a measure of the individual’s performance, the IDP document is obviously not comparable among schools or individuals (NAE 2010b).

Previous research

Assessment is a growing field of interest in Sweden and internationally. However, research, and in particular research focusing on teachers’ assessment practices, is so far limited (Forsberg and Lindberg 2010; Lindberg 2005).

Assessment for learning and formative assessment

Assessment whose primary purpose is to move the student forward, i.e. to reduce the gap between the current position and the targeted one, is commonly referred to as formative assessment (Black and Wiliam 1998; Kimbell 2007; Wiliam 2009). International research is consistent in its conclusions about the positive effects of formative assessment on students’ progress (Black and Wiliam 1998; Hattie and Timperley 2007; Hattie 2009; Kimbell 2007; Leahy et al. 2005; Wiliam 2009). Educational assessment has had, still has, and most likely will continue to have many different meanings, aims and purposes (Hartell 2012; Newton 2007). However much the aims and purposes of assessment may differ, when an assessment does not serve the pupil’s future progress, its usefulness must be questioned (Bennett 2011; Gipps 2004; Hartell 2012; Newton 2007; Pettersson 2009).

Thus it is time to stop arguing about whether to assess or not. Instead, it is time to start asking questions about how, for what purposes and with what consequences we assess (Vallberg-Roth 2010). A change from quality control to quality assurance is needed, shifting the focus from assessment of learning to assessment for learning; or, rather, a combination of both but with the emphasis on the latter. To support teachers’ work with assessment, a new assessment device, the individual development plan with written assessments, was introduced in 2008 to be used for all students in compulsory school (NAE 2009a, b).

The concept of formative assessment has been misused and misinterpreted. It is found in various contexts and often used in a simplified way (Bennett 2011; Wiliam 2009). In order to clarify the concepts, some suggestions regarding the definition of formative assessment have been offered. Wiliam (2009) suggests a difference between formative assessment and assessment for learning with respect to purpose and function. This distinction is important. There is, according to Wiliam (ibid.), a difference between the intention of gathering evidence of learning and the results of gathering evidence of learning, that is, whether the gathered evidence is used or not and whether the evidence shows that the learner has moved forward or not. Concept clarification is also required in order to prevent simplistic implementation and disregard of the difficulties (Bennett 2011). To summarize:

The term ‘assessment for learning’ speaks about the purpose of the assessment, while the term ‘formative assessment’ speaks about the function it actually serves (Wiliam 2009, p. 9).

Cycles of formative assessment

Wiliam (2009) suggests three different kinds of formative assessment, each representing a different time aspect. He introduces the concepts of long, medium and short cycles of formative assessment. These different kinds of formative assessment are obviously also found in the Swedish school system. The previously mentioned IDP document is an example of long-cycle formative assessment, since it is generally presented once every semester (Hirsh 2011). Tests performed during or after a theme or work period are examples of the medium cycle of formative assessment. Finally, Wiliam refers to short-cycle assessment. According to Wiliam (2009), short-cycle assessment (teachers’ day-by-day and minute-by-minute assessments in the classroom) is the most effective type of formative assessment. Both Black (2008) and Moreland et al. (2008) describe formative assessment as an integral part of technology education. Within the Swedish context, Hartell (2012) showed in a classroom study that teachers in their daily practice perform short-cycle assessment in each and every lesson, for example by asking questions or ‘looking for glimpses of understanding in their eyes’, with the intention of moving their students forward in technology. Hartell also found that teachers were alone when planning, executing and following up their students’ progress in technology, which is contrary to what Hattie (2012), Kimbell (2007) and Wiliam (2011) argue for: the importance of teachers planning assessment in advance and in cooperation with others.

Research about IDP with written assessments

The IDP document is a Swedish phenomenon. However, there are similar assessment documents in other countries, sometimes called educational plans or learning plans (Hirsh 2011). The difference between the IDP in Sweden and similar documents in other countries is that the Swedish document is mandatory for all students, not just for those who are in need of special support. The fact that the IDP document is compulsory for all students has implications for the quality of the document, for example the need for time to work with it; time which, internationally, has been identified as crucial when working with these kinds of documents (ibid.). As a student will receive approximately 270 written assessments during their 9 years of compulsory schooling, the effect in relation to the time spent is questioned by Vallberg-Roth (2010). The question of storing IDP documents (legally public documents) locally in a secure way is, in this context, also of great importance (Andersson 2011).

No previous study has focused on the use of the IDP in connection with technology education. However, the difference between written assessments in theoretical and practical subjects is discussed in an evaluative report from the NAE (NAE 2010a, b). According to this report, written assessments in theoretical subjects are commonly considered to be objective and goal oriented in comparison with those in practical subjects, in which student personality and behaviour are more frequently highlighted by the teachers. The NAE report concludes that most of the samples collected include summative rather than formative assessment. The NAE highlights the general absence of links between current steering documents and the IDP, which is identified as an important ‘area of improvement’ (ibid.). A number of different IDP forms/templates (some excluding one or more school subjects) are identified and examined in the report. This variation in the number of subjects included in the written assessments is questioned, in particular the limited amount of written assessment in, for example, science in the sample. In the context of this study it should be noted that written assessments in technology are, due to the shared timetable, reported together with science in the evaluative NAE report.

When highlighting the question of teachers’ wording of future planning targets in IDP documents, Hirsh (2011) identifies three categories of targets: being, doing and learning. In a majority of these IDP documents (three-fifths) the targets set for students are connected to the students’ being. The students are supposed to ‘change their behavior’, their ‘ways of being’ or their attitude, remarks undoubtedly connected to the student’s self as a person (Hirsh 2011). Such comments about personality appear frequently in the documents, with an evident risk of violating the privacy of the child (Vallberg-Roth 2010).

Aim and research question

The aim of this study is to explore teachers’ use of the IDP document in compulsory school in the mandatory school subject of technology:

How is the IDP document used by primary school teachers in their follow-up and future planning of their students’ knowledge development in technology?

Method

This is a qualitative study in which authentic IDP documents are studied. Within every municipality in Sweden every school principal decides what specific IDP template is to be used in her/his school. This makes comparison between, as well as comprehensive studies within, the municipalities difficult. This type of comparison is not the intention here.

The research question has been addressed by analysing authentic documents. Below the design of the study is presented.

Data collection

Selection of municipalities

Authentic samples of IDP documents with written assessments used in primary schools were collected from three municipalities for the purpose of examining the content in relation to the mandatory school subject of technology. To increase the scope, these data were then supplemented by a sample of IDP documents from two other municipalities, collected by Hirsh (2011). In total, 351 IDP documents from five municipalities constitute the basis of this study.

The municipalities chosen were classified according to the categorization provided by the regional and municipal employers’ organization SALAR (www.skl.se). A number of different types of municipalities (see “Appendix 1”) were found in the sample. Other differences were also found; for example, the number of ninth graders (140) in one of the municipalities is the same as the number of ninth graders in one single school in another municipality. Even though this is not a quantitative study, the coverage of different kinds of municipalities and regions increases the validity of the study.

Reliability of the data

Even though IDPs are public documents, they are hard to come by. The first round of data collection depended on gaining access to informants who could gather illustrative samples from school years one to six. To corroborate the preliminary findings from this first data set, a larger sample was needed. To save time, IDPs gathered by a fellow researcher (Hirsh) were added. These additional samples cover all the schools in two additional municipalities, D and E, in school years three and five. The data include different numbers of IDP documents from different municipalities and schools. It should be noted that no claim is made here of having a representative sample, either for the country or for the involved informants (municipality/school level). Nevertheless, the sample contains different school years (years 1–6) and different kinds of schools in different municipalities in diverse locations and settings (cf. “Appendix 2”). The sample is an illustrative sample of authentic IDP documents.

Processing the data

The section below will provide a description of how the collected data have been handled and analysed.

I

Repeated readings were undertaken of all samples, looking for patterns regarding similarities and differences, for example school name, teacher names, design, goal formulations and subjects included in the IDP document. Fourteen different versions of templates were found and classified into five categories with regard to design, e.g. boxes to tick, formulated goals, etc. (cf. Table 1).

Table 1 Summary of the findings. Those templates where technology is included are written in bold

II

The templates were then reviewed again for information regarding student progress in technology. The initial approach was to investigate if and how goal fulfilment was reflected in the written assessments, with the intention of studying in what ways teachers express their knowledge about students’ positions and progress in technology. However, in light of the preliminary results (see further down) the plan was changed. Instead, this second review focused on the presence of the word teknik (the Swedish name of the technology subject) in order to find exemplars of written assessments in technology. The text analysis was undertaken qualitatively, with myself as the instrument when interpreting the data (Bresler 2006). Drawing on my prior knowledge as a technology teacher, I went beyond the mere label teknik, also searching the other subjects included in the IDP documents for information regarding technology content and student progress in technology according to the national curricula (NAE 2009b). Two groups were found: (I) templates including teknik and (II) templates not including teknik.
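Purely as an illustration of the screening logic in this step (the analysis reported here was carried out manually and interpretively, not by software), the division of templates into the two groups could be sketched roughly as follows. The folder name, file format and keyword handling are hypothetical assumptions and not part of the actual study:

```python
from pathlib import Path

# Hypothetical keyword used to screen transcribed IDP templates for the
# technology subject; the actual analysis in the study was carried out by hand.
KEYWORDS = ("teknik",)


def mentions_technology(text: str) -> bool:
    """Return True if any screening keyword occurs in the document text."""
    lowered = text.lower()
    return any(word in lowered for word in KEYWORDS)


def screen_templates(folder: str) -> dict:
    """Split plain-text template files into the two groups of step II:
    (I) templates including 'teknik' and (II) templates not including it."""
    groups = {"including_teknik": [], "excluding_teknik": []}
    for path in sorted(Path(folder).glob("*.txt")):
        text = path.read_text(encoding="utf-8")
        key = "including_teknik" if mentions_technology(text) else "excluding_teknik"
        groups[key].append(path.name)
    return groups


if __name__ == "__main__":
    # Example usage with a hypothetical folder of transcribed templates.
    result = screen_templates("idp_templates")
    for group, names in result.items():
        print(group, len(names))
```

Such a substring check only reproduces the mechanical part of the step; the subsequent interpretation, going beyond the mere label teknik, still rests on the researcher’s subject knowledge.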

III

The five templates including teknik were then examined more deeply, again using my prior experience as a technology teacher and knowledge of the curricula, in search of both the explicit and implicit content of the templates, such as direct and indirect information about (I) the student’s current ‘position’ and (II) suggestions regarding strategies for further progress in technology. The results from this phase of the analysis were discussed with teacher colleagues and fellow researchers (second reader procedure).

IV

Finally, the findings were questioned and compared with previous research about technology education, formative assessment and IDP documents in order to understand the findings.

Ethical considerations

The ethical regulations presented by Vetenskapsrådet (2005) have been carefully followed and all names (of municipalities, persons and schools) have been anonymized.

Results

The results presented are based on samples of authentic templates of the IDP with written assessments, gathered from the five municipalities in school years 1 through 6 during school year 2008/2009. It is an illustrative sample (14 templates comprising 351 documents). The number of samples from each template varies; the collection is illustrative rather than statistically representative. In this section the results are presented, together with some complementary explanations regarding how the study was undertaken.

The IDP template–part I: the design

According to the regulations surrounding this document, the design of the IDP template is to be decided upon locally. In this sample the design of the IDP documents varies. There are recurring elements (selecting the level of knowledge by ticking boxes, designated areas/space for teachers’ comments on a limited scale), and fourteen different types of templates (T1–T14) were identified in the data. They were divided into categories based on similarities and differences in design, see Table 1.

Written assessments are to be given in every subject taught. Technology is included in five and excluded in a majority (nine) of the fourteen identified ‘types’ of templates. Some of the templates (T2, T6, T12, T13, T14) include standardized goal formulations, for which the teacher ticks a box indicating the student’s level of achievement. The number of goals to be ticked varies between three and fourteen among the different subjects included in the collection investigated here. In technology the number of goals varies between three and four.

These templates also include space where teachers can write comments on individual student achievements alongside the standardized goals. Individual comments to students and information on where to go next appear in all the templates and across subjects, with the exception of the T1 template, which only includes boxes to tick. However, in those templates that include both standardized goals and the option to write individual comments, the comments in technology (when present) are standardized and cover all students, and are thus not directed to the individual, as they are in other subjects. Where teachers are free to formulate comments themselves, without standardized goals (T3, T4, T7, T8, T9, T10, T11), no information regarding technology is found.

Every student is entitled to the knowledge stipulated in the curricula, and according to the reported goal fulfilment all students, with very few exceptions, achieve the goals in technology. Ten students (of 240) were identified in the T1 template as not achieving the goals in technology. However, they did not achieve the goals in the majority of the other subjects either. No information was found regarding the grounds on which the assessment was made, or regarding strategies on where to go next.

In Appendix 2 there is a thorough description of each of the templates.

The IDP template–part II: information about the student’s position or further progress in technology

A closer look for information about student knowledge development and progress in technology was undertaken in those five templates (T1, T2, T4, T13, and T14) that included teknik (technology).

Technology is included in five of the fourteen ‘types’ of templates identified in this study. In template type T4 technology is mentioned together with the science subjects. There is no information targeting achieved skills/learning outcomes in technology in this type of template. In another type of template (T1) technology is found under its own heading together with a designated box to be ticked by the teacher indicating an achieved goal. There is, however, no information regarding the goals in question. In template types T2, T13 and T14 technology has its own heading. There are designated boxes to be ticked by the teacher indicating achieved goals. In these types of templates, standard formulations regarding the goals in question are included (cf. template type T1). One of the standardized goals cited for all students reads as follows: ‘Can follow an instruction and make simple construction with Lego and simple material.’ In most types of templates there are designated areas for the teachers’ comments. In technology this space is left blank throughout the data studied, with one exception. One student received the following comment in template T14: ‘John is a clever and a good—builder—on e.g. Lego.’ This is the only additional comment in technology directed to a specific student found in the whole body of data. No information on how to make further progress in technology is found in any of the documents. It should be noted that such comments are frequent regarding achievements in all other subjects in the same template.

In summary: Technology is almost completely invisible in the studied IDP documents. Only five out of fourteen types of IDP templates in the sample studied cover technology education. In the cases in which technology is included in the template, the use of standardized wording regarding goals and goal fulfilment is frequent. With the exception of one document (out of the 351 studied), no comments regarding individual students’ achievements in technology could be found. None of the studied templates include suggestions regarding strategies for further progress. See Table 1.

Discussion

The purpose of this study is to explore the possible usefulness of the IDP document for teachers in their future planning. The simplest direct answer to the research question put forward in this article (How is the IDP document used by primary school teachers in their follow-up and future planning of their students’ knowledge development in technology?) would be ‘It is not used at all’. However, a deeper discussion is needed and is presented in the following sections.

The constraining design of the template

When you are teaching, you have to start from where your students are (Kimbell 2007). When locating a student’s current position in her/his learning journey, the teacher cannot assume that the student knows nothing about things the teacher has not yet taught to the class. It is more complicated than that (Kimbell 2007; Wiliam 2009), and yet establishing where students are is what Hattie (2012) argues is the most important thing for the teacher to do. In theory, the IDP could be such a systematic tool for identifying a student’s starting point and, from that, planning future progress. However, the results presented here show that the current design of the IDP templates constrains this possibility in practice. The initial purpose of introducing the IDP document for all students was to provide information about where the students are ‘positioned’ in relation to the curriculum goals. Unfortunately, this has not been the case in technology education.

The IDP documents investigated are, I argue, useless to technology teachers in their present form. They provide no support whatsoever for teachers wanting to find out where the students are ‘positioned’ in technology. The lack of space designated for technology in the templates investigated makes it problematic for teachers who want to adjust their future technology teaching to the needs of their students to do so using the IDP as a tool. When the design of the template excludes the technology subject, this in itself is a powerful constraint. This not only restricts the use of the IDP; it also conveys unfortunate signals regarding the importance and status of technology education in Sweden. Discussions regarding this issue are much needed.

Written assessments are to be given in all subjects taught. The results presented here are consistent with the NAE report (2010a) regarding the issue of not all school subjects being present in the IDP documents. The study presented here is a valuable complement to the study undertaken by the NAE (2010a), where the technology subject is grouped together with the science subjects (biology, physics and chemistry); thus no results presented in the NAE study (ibid.) are exclusively valid for technology education. Even though the NAE groups technology with science, some comparisons may be drawn. For example, the NAE questions the limited number of examples of written assessments in science (and thus also technology) in its data, which can be seen as a confirmation of the results presented here. Even though this study contributes to the field, I argue that there is a great need for further consideration of the technology subject, in particular regarding support for teachers (e.g. in-service training, guidelines, locally initiated peer reviews among colleagues). Based on the findings of this study, I strongly argue that both the pros and cons of combining science and technology need to be questioned. Technology should be present on equal grounds not only in IDP documents but also in all kinds of interventions, evaluative reports, steering documents and regulations (including teacher training). The expression “… and technology” and the routine of unreflectively enclosing technology within the science subjects (sometimes excused by the shared timetable) send unfortunate (and inaccurate) signals about the status and importance of technology in school and, by extension, in society.

The regulations state that the design of the IDP templates used is decided at the school level; however, some municipalities use the same template in all of their schools. The NAE (2010a) highlights the question of standardized IDP forms, and the results presented here can add to this discussion. In order for teachers to be able to report on progress in technology, they must have the possibility to do so within the IDP template. The results presented show that in a majority of the IDP templates studied this is not possible. The exclusion of any given subject from the templates restrains teachers’ possibility to provide information regarding student progress; that is, the design of the template prevents them from being able to do their job. So, to add to the discussion, I argue that the inclusion of all subjects (even technology) in the templates should at least be a minimum requirement!

In summary: The findings from this study show that the design of the IDP template is important, as it constrains the teachers. A majority of the templates studied do not cover all subjects and are, in their present form and in most cases, useless in technology.

Contradictory goal fulfillment

Several reports show the unsatisfactory situation for technology in compulsory school (e.g. ASEI 2005; SSI 2009). Despite these reports, the official goal fulfilment in technology in year 9 is reported to be among the highest of all subjects (Hartell 2011). The findings from this study provide some interesting (and contradictory) evidence regarding this matter. As previously mentioned, a majority of the IDP documents examined in this study do not include any information on technology at all. However, when technology is included, the reported goal fulfilment in the primary years of schooling is very positive. The vast majority of the students reach the goals, according to the information given. This promising picture contradicts the previous reports regarding the situation for this subject. However, the findings in this study reveal that all students receive the same, standardized information about their personal achievements in technology in their respective IDP documents; no individual information is given. According to the information given in the documents studied, not all students reach the targets. In the T1 type template, for example, 10 students failed to achieve the goals in technology. The information given in other subjects showed that none of these 10 students achieved the goals in a majority of the other subjects either. This result raises interesting questions for further investigation. On what grounds were the assessments made? And how were they enacted in the classroom?

In summary: Prior reports, evaluations and research consistently indicate that little technology education takes place in compulsory school, especially in the primary years of schooling. Still, the goal fulfilment reported at the end of compulsory school is excellent. The results presented here show that the reported goal fulfilment in the early years of schooling is also excellent. The actual goal fulfilment is, according to the findings in this study, questionable: all students receive the same, standardized information about their personal achievements in technology in their respective IDP documents, and no individual information is given.

The impossible use of IDPs in planning technology teaching

The NAE report (2010a, b) shows that most IDP documents include some suggestions regarding what needs to be done to move the learner forward. This is confirmed in all subjects in the sample presented here, with one exception: technology. The information given in technology did not provide any suggestions regarding what to do next. It is unfortunately not possible to compare this result with previous results (Andersson 2011; Hirsh 2011; NAE 2010b), since these studies include technology within the science subjects. This gives rise to yet another question that needs further investigation: Why do teachers, who are given the possibility to comment on student achievements, do so in all subjects but technology? Based on the findings, I argue that this highlights the need for teachers to have access to consistent language concepts relevant to technology teaching. I also argue that there is a need for collegial discussions about subject content, teaching strategies, work methods and assessment criteria in teknik. Such discussions have been asked for by teachers (Hartell 2012) and should, according to regulations, be initiated and supported by local and national school authorities (NAE 2009a, b; 2011).

In summary: In the IDP documents, teachers commonly comment on the achievements of their students. This is true in all subjects except technology. It has been suggested that teachers need access to consistent language concepts relevant to technology teaching, which implies that teachers should be given the opportunity to meet regularly for collegial discussion and reflection about technology teaching, in accordance with the regulations.

Questioning the ‘formative’ IDP

The statutory intention of the IDP is to be formative. In order to bring some order to the concepts surrounding formative assessment, Wiliam (2009) stresses the difference between the purpose of gathering evidence and the actual use of that evidence, including the time elapsed between gathering and use. In order for the IDP document to be formative, the information gathered has to be used to move the student forward. In order for the evidence to be used, the information about students has to be visible and significant. The results presented here show an evident lack of any information whatsoever that could possibly be used in relation to technology. From this I argue that the IDP document is, in this respect, not formative. Rather, the findings reveal that it does not seem to be possible to use the IDP in a ‘formative way’ in technology.

Based on the results presented, I argue for a discussion regarding the effectiveness of IDPs in general. In Wiliam’s (2009) terms, the IDP document belongs, according to Hirsh (2011), to the long-cycle category of formative assessment. Thus the IDP belongs to the least effective cycle of formative assessment, which according to Wiliam might be useful at a more strategic level but does not have the same impact on students’ learning as the short cycle of formative assessment. Even so, a document is not formative in itself; it has to be possible to use the information to better meet students’ needs, i.e. the assessments made have to be formative in their function. Teachers’ tacit views on learning influence their teaching (Black 2009), and they are more likely to articulate their points of view on learning when they are designing assessment tools (Black and Wiliam 2009; Elwood 2008; James 2010). Why the IDP documents are designed the way they are is therefore indeed a relevant question and a possible continuation of this study.

Finally, even though the issue of time is not a focus of this study, discussions are needed regarding the amount of time teachers commonly spend working with the IDP document as part of their formal assessment practice and the effect this work has on student results. The findings in this study indicate that there may be a mismatch.

In summary: The authorities present the IDP as an individual formative document, mandatory for all students. The IDP documents studied here cannot, however, be used in a ‘formative way’ in technology education. The long-cycle profile of the IDP document limits its effectiveness compared with short- and medium-cycle assessments. The time spent on IDP documentation needs to be weighed against its effects on student results.

Final remarks

The findings in this study raise questions regarding the validity of assessments in technology at every level: national policy, municipality and school, including both the teacher level (the collective assessment practice in technology) and the school level (the responsibility to ensure and secure teachers’ assessment competence). From this I argue for more research and more support for teachers concerning assessment in general and assessment in technology in particular. There must be improvement in order to move every student forward in technology.

I hope for a sober future discussion regarding teachers’ formal and informal assessments. Preferably, not all effort should be placed on the least effective cycle of formative assessment, to which IDP documentation belongs. It would be more interesting to put the effort into the short cycle of formative assessment: how can it be enacted and supported in different teaching situations to effectively support learning in technology? The possibilities for drawing conclusions from the classroom (nods and glimpses of understanding) need to be discussed with other teachers. In theory the IDP could be such a system, but then it has to be possible to use it to identify the individual student’s position and to support further progress regarding where to go next along the student’s learning journey.

This article ends with a note from a former ninth grader, published in a reader’s column in one of the largest morning papers in Sweden (Svensson 2012). She concludes from her experience of receiving written assessments in compulsory school that most of them were either unnecessary or said nothing at all. Instead she advises teachers to think about how to write so that the student gets a sense of learning from her or his mistakes, something she had experienced positively in one subject. She argues that teachers should make sure that the student understands what is said in the written assessments and is thus actually helped by the information, which is the intention of the document. Her point of view is in line with previously presented research concerning the use of assessment for learning to stimulate progress.