Introduction

This Special Edition of the journal is dedicated to the story of project e-scape that was run in the Technology Education Research Unit (TERU) at Goldsmiths, University of London from 2004 to 2010. Since e-scape is an approach to the creation and assessment of learner-generated web-portfolios, it is a story that may appear—at first glance—to be dominated by technology. But it is not. We have—for sure—developed and used a lot of technology, and reviewed much more, but more often than not these technologies were rejected. And the principal reason for rejection was that they were (for one reason or another) not sympathetic to learners’ and teachers’ priorities in the classroom/workshop/studio. The story might also appear to be dominated by assessment, and it is fair to say that assessment concerns have been high on the agenda. But neither the technology, nor the demands of assessment, have been at the top of our agenda.

Our overriding ambition throughout the 7 years of e-scape—through three developmental phases—has been to enrich learning and teaching approaches.

The e-scape story may also appear to be dominated by the subject domain of Design & Technology. But again this would be to miss the point. It is true that TERU has grown from those roots and all TERU personnel were once teachers in that domain. But one of the delights of undertaking research and development within a D&T framework is that we can get away with things that would not be countenanced in more ‘serious’ domains. The radical approaches that we have sometimes developed (see, for example, the article by Pollitt) would never have emanated from a maths research unit, or a science research unit or a language research unit. But in our world we have a little more scope for radical action—and we have more elbow-room to try things out with schools. And having shown that they can work—and that they can dramatically enhance existing practice—the ‘serious’ disciplines can adopt such initiatives with much less risk. So now e-scape projects run in science, foreign languages, social sciences, English language, geography, physical education studies, mathematics and many more disciplines. We are delighted with this widespread adoption and the potential benefits it brings to learners and teachers.

Starting points

In 2003, the Technology Education Research Unit (TERU) at Goldsmiths, University of London was asked by the Qualifications and Curriculum Authority (QCA) in the UK to undertake research to examine the extent to which—and the ways in which—innovation and team-work might be more fully recognised and rewarded in assessment processes, particularly within the General Certificate of Secondary Education (GCSE) at 16+ (Year 11). The project was commissioned within the curriculum setting of Design and Technology (D&T); ‘Assessing Design Innovation’ was launched in January 2003 and concluded in December 2004.

The principal outcome of that project was a paper-based portfolio assessment system for learners’ performance in D&T tasks. The approach that we developed sat somewhere between a formal timed examination and the more individual arrangements that typically apply to the assessment of student coursework. The model of structured activity that we developed was designed to operate in 6 h—typically 2 mornings—and presented learners with a design task that was to be taken through to a prototype. The outcomes of learners’ work during that project were most encouraging. It proved possible to demonstrate that different levels of innovation were identifiable in the work and that the best work was highly innovative. Critically, the consensus of teachers and learners was that the portfolio system acted as a dynamic force to drive the activity forward with pace and purpose. The data from the trials of the system was fully reported in the project report (Kimbell et al. 2004).

Alongside this development, it is important to note a number of parallel strands of influence that combined to create project e-scape.

Assessment for learning had become a major concern of educators. It placed the teacher (rather than any external body) at the heart of the assessment process and presented them with large amounts of personalised learning information to manage (see, for example, Black and Wiliam 2003). Within this emerging field, we saw much value in exploring the use of digital systems to support teachers and learners.

In this digital context, e-learning is a term that had emerged to describe a wide range of digitally enhanced educational experiences. The England and Wales Department for Education and Skills (DfES) e-learning strategy identified the provision of a centralised e-portfolio as an important priority for reform, second only to the provision of the infrastructure to make it work. And the strategy made clear that this portfolio system was to be seen as altogether more than merely a repository for learners’ work.

…We will encourage every institution to offer a personal online learning space to store coursework, course resources, results, and achievements. We will work towards developing a personal identifier for each learner, so that education organisations can support an individual’s progression more effectively. Together, these facilities will become an electronic portfolio, making it simpler for learners to build their record of achievement throughout their lifelong learning. (Department for Education and Skills (DFES) 2005)

In the context of D&T in the UK alone, Awarding Bodies were responsible for the assessment of approximately half a million students annually using portfolios in which learners developed a design solution to a task of their own choosing, simultaneously telling the story of their development process. With most awarding bodies, a major block of marks (often 50% of learners’ GCSE marks) was allocated on the basis of the quality of these portfolios. The Awarding Bodies responsible for these assessments—particularly at GCSE—were increasingly seeking to exploit the power of digital technologies.

This combination of influences led us in TERU at Goldsmiths to develop a proposal to QCA/DfES for a digital approach to portfolio assessment. Learning activities in D&T studios and workshops were increasingly influenced by digital technology, and we believed that the portfolio assessment system that we had developed in the DfES “Assessing Design Innovation” project provided a useful model to explore the possibilities of extending digital working in D&T into digital assessment of learners’ performance. Project e-scape (e-solutions for creative assessment in portfolio environments) was established as a result of discussions with DfES, QCA and Awarding Bodies.

Clarifying terms … “portfolio”

The concept of a ‘portfolio’ was (and probably still is) problematic … meaning very different things to different people. The potential for different interpretations is increased by the use of portfolios as an assessment tool, and complicated yet further in the context of e-learning, where ‘e-portfolio assessment’ has become a minefield of misunderstanding and confusion.

As a starting point, we recognised that there are many purposes to which portfolios might be applied. Many individuals and organisations have attempted to analyse these multiple visions, e.g., IMS Global Learning (developing specifications for e-learning environments) and Mason, Pegler and Weller (developing assessment tools for online courses), as reported in the British Journal of Educational Technology. Mason and colleagues, for example, distinguish:

  • assessment portfolios

  • presentation portfolios

  • learning portfolios

  • personal development portfolios

  • multiple owner project portfolios

  • working portfolios

(Mason et al. 2004, p. 720)

For the purposes of e-scape we believed it would be helpful to clarify our understanding of what a portfolio is and how it works in D&T. Whilst these portfolios have been refined over many years and attuned in particular to the priorities of assessment, nonetheless, the essence of a D&T portfolio involves a mix of what the IMS lists as an assessment portfolio, a learning portfolio and a working portfolio.

Through custom and practice in D&T, it is possible to observe several forms of portfolio.

  1. The most common meaning of ‘portfolio’ defines it as something akin to a box-file into which the learner (or perhaps the learner’s teacher) can place work to demonstrate that certain operations, or skills, or processes have been experienced. Viewed in assessment terms, the learner’s portfolio becomes a collection of evidence that is then judged against some rubric to arrive at a mark or a level. A portfolio of this kind is conceived as little more than a container for evidence.

Translated into the e-portfolio world, it is possible to conceive of many ways in which the evidence being ‘contained’ could be enhanced through the application of database or spreadsheet systems, which might even be designed to automate the process of containment, standardising, streamlining and potentially removing the need for human interaction.

  2. A somewhat more sophisticated view of portfolio arises from process-rich areas of the curriculum, where teachers encourage students to document the story of a developing project or experience. This results in learners reporting what they have done at various points in the process.

In this kind of ‘presenting’ or ‘reporting’ e-portfolio, it is not unusual for students to use linear digital presentation technologies—e.g., PowerPoint—to give a blow-by-blow account of where they have been in the project—and how they finally got to the end.

However, whilst these two accounts might be seen as part of the picture, neither of them captures the dynamic-capability dimension that informs our view of a D&T portfolio.

The central problem—in both cases—is that the portfolio construction is conceived as a second-hand activity. First you do the activity—whatever it is—and then afterwards you construct a portfolio that somehow documents it. The portfolio is a backward-looking reflection on the experience.

  3. A third and far richer view of the concept of the portfolio is evidenced in schools where teachers have embraced the challenge of linking learning and working concepts of the portfolio to the more commonplace assessment portfolio.

In this rich form, the portfolio is transformed into an entity that is integrated into, and grows dynamically with, the project, and in the process it shapes and pushes the project forward. The best analogy is neither a container nor a reported story, but rather a dialogue. The designer/learner is having a conversation with him/herself through the medium of the portfolio. So it contains ideas that pop up but appear to go nowhere; good ideas that emerge from somewhere and grow into part-solutions; and thoughts arising from others’ comments and reflections on the ideas. Any of these thoughts and ideas may arise from procedural prompts that are deliberately located in the activity to lubricate the dialogue. Looking in on this form of portfolio is closer to looking inside the head of the learner, revealing more of what they are thinking and feeling, and witnessing the live, real-time struggle to resolve the issues that surround and make up the task. Importantly, this dynamic version of the portfolio does not place an unreal post-activity burden on learners to reconstruct a sanitised account of the process. Creative learners are particularly resistant to what they see as such unnecessary and unconnected tasks, and this significantly accounts for their underperformance in portfolio assessments that demand such post hoc storytelling.

But real-time dynamic portfolios are not tidy, nor is it possible to present them in a pre-determined PowerPoint template. Such a portfolio is more like a designer’s sketchbook, full of notes and jottings, sketches, ideas, thoughts, images, recordings and clippings. These manifestations are not random, but are tuned to the challenge of resolving the task in hand. And the point of the portfolio is that the process of working on it shapes and develops the activity and the emerging solution.

Our three categories of portfolio are somewhat dissimilar to those identified by Ridgway, McCusker and Pead (for Nesta Futurelab) in their literature review of e-portfolios.

There are three distinct uses for portfolios:

  • The first is to provide a repository for student work;

  • the second is to provide a stimulus for reflective activity—which might involve reflection by the student, and critical and creative input from peers and tutors;

  • the third is as showcase, which might be selected by the student to represent their ‘best work’ (as in an artist’s portfolio) or to show that the student has satisfied some externally defined criteria, as in some teacher accreditation systems (e.g., Shulman 1998).

(Nesta-Futurelab 2005)

Whilst their first category is the same as ours, their third seems to be little more than an extension of this, allowing for the repository to contain work selected over time and used (inter alia) for assessment purposes. It is a container with some display potential. Furthermore, whilst their second category contains some elements of dialogue potential, it does not capture the dynamic creative essence of portfolios as we see them operating in D&T.

These disagreements demonstrate the thorny territory that is conjured up merely by the use of the term e-portfolio. We were very conscious of these issues, and they demonstrated the absolute necessity of being very clear about what we proposed within project e-scape.

Timeless portfolios and assessment distortion

The tradition of coursework portfolios in D&T evolved over decades, and particularly from the inception of GCSE assessment in 1985. In these assessment portfolios, learners’ final projects were to be fully documented as process-rich, blow-by-blow accounts of their journey from brief to evaluation of final prototype. But as (in the 1990s) teachers and schools became increasingly accountable for students’ performance, and as school league tables and Ofsted reports exposed this accountability in the national press, teachers inevitably found ways to ensure that their students created bullet-proof portfolios. Portfolios were increasingly ‘managed’ by teachers to ensure good performance, and as this tendency became the norm, the assessments became increasingly invalid. Performance scores became as much about the quality-control exerted by teachers as they were about the designing capability of students. Ive HMI, the Ofsted Subject Adviser for Design and Technology, had been warning against this from the mid 1990s and, addressing the 2001 National Conference of NAAIDT (the National Association of Advisers and Inspectors of Design & Technology), drew attention to the extent to which ‘the folder’ (i.e., portfolio) had become a pretty product in its own right rather than merely the vehicle through which the story of the project is told. He specifically castigated the tendency towards ‘hoop-jumping’, ‘death by borders’ and the proliferation of ‘neat nonsense’ used to pad out the folders (Ive 2001). In short, there was increasing concern (Drabble 2000) that the best GCSE grades were going not to the best design students, but rather to the students whose performance was most rigorously managed by knowledgeable teachers and to those who had the patience (and dogged persistence) to follow such guidance.

In TERU we were very aware of this issue, and of the fact that the problem is created by the timeless nature of the portfolio. Major design projects would typically run for 6 months or more, with endless time for tinkering with the portfolio. By contrast, based on our APU experience (Kimbell et al. 1991), we had for some years been developing an approach to portfolios that we called ‘unpickled’ (see Stables and Kimbell 2000), in the sense that performance was tracked in real-time without the normal benefit of hours and weeks of reflection time (pickling) in which to pretty-up the portfolio. In our unpickled versions, learners’ portfolios emerge as the raw, real-time ‘trace-left-behind’ by their activities. The transition from timeless portfolios to real-time portfolios is significant in several ways, but centrally the issue concerns direct, emergent evidence of process-centred capability.

Real-time approaches for the collection of performance data have been used for some years in the worlds of management and computer science, where system evaluation (and indeed assessment) has entailed the forensic analysis (step-by-step/minute-by-minute) of sub-elements of the performance of systems (e.g., Calvez et al. 1993; Mohd Adnan 1993). And in the learning context, using interactive computer systems, Hillary (2005) has developed a real-time interactive tool for researchers to track and examine the activity patterns of users. It seemed to us [as early as the APU work of 1985 (Kimbell et al. 1991)] that since we claim that process (i.e. designing) performance is central to what we do and value, we should develop exactly this forensic approach to the creation of design portfolios.

‘Peripheral’ digital technologies

One of the problems surrounding the use of digital technologies in learning is that schools’ provision of hardware tended (in 2004) towards the assumption that computer use takes place in a computer suite, rich in desktop or laptop machines, ruled over by an uber-cyber-dragon, where learners work with a keyboard and screen. Our starting point was very different.

We started from assumptions about the nature of D&T, the context of which is almost always workshops and studios. Two of the constants of these typical spaces are that

  • they are full of materials, apparatus, machinery

  • they are associated with the detritus (mess) of manufacturing

They therefore make challenging locations for computers, keyboards and screens. There is never enough space; the space is not clean (glue, paint, flour & water, sawdust) and learners themselves get oily or painty or gluey or floury fingers that are not then ideally suited to keyboard use.

For all these reasons (right from the outset of e-scape) we did not believe that digital enhancement of the designing activity would involve computers, keyboards and screens. At least we did not believe that these tools would be at the leading edge of activity. Rather we thought that peripheral, back-pocket technologies would be more appropriate: mini digital cameras, digital pens, digital PDAs, mobile phones.

At least at the ‘input’ level, these technologies enable activities in workshops and studios to go ahead almost as normal. They don’t take up too much space and (because they can be pocketed) they are not too sensitive to the clutter of the working space. Our trials in schools were designed to establish how realistic this position-of-principle was.

Interestingly, most students aged 11–16 now have access to mobile phones, a significant proportion of which have digital cameras as a built-in feature. We were aware that as telecom companies raced to differentiate their systems through enhanced features, the distinction between handheld PDAs and mobile handsets would disappear as the two previously unconnected technology strands merged. While ‘smart’ phones, with all the features of a PDA, are currently not marketed to pupils, camera phones are becoming ubiquitous and other ‘smart’ features are increasingly working their way into phones for children. This trend will be all the quicker if it is seen (or marketed) as providing valuable tools for learning, thereby justifying additional parental expenditure.

In short, we were witnessing the growth of third generation computing. Mainframe computer technologies of the 1960s and 1970s gradually faded with the emergence of second generation ‘desktop’ computers. These completely transformed our working relationship with computers, providing us with far greater interactivity, apparently unmediated by the programmers whose services had formerly been essential. We could ‘drive’ our own second generation computers in the 1980s and 1990s. As the technologies shrank, the growth of laptop computers, particularly in the final decade of the twentieth century, did not materially change our relationship to computers. They operated merely as slightly (very slightly) more mobile versions of the desktop. But the new third generation of computers is radically different. They are far more mobile, are equally powerful, and can now genuinely be regarded as ‘back-pocket’ computers. As such, they are in the process of transforming, once again, our working relationship with computers. In the contexts of learning, teaching, curriculum and schools, these transformations are profound.

The brief for project e-scape

The brief that launched project e-scape was drafted by the Qualifications and Curriculum Authority in the following form.

QCA intends now to initiate the development of an innovative portfolio-based (or extended task) approach to assessing Design and Technology at GCSE. This will use digital technology extensively, both to capture the student’s work and for grading purposes. The purpose of Phase I is to evaluate the feasibility of the approach…

Phase 1 of the project (November 2004–June 2005) was, in several senses, a “proof of concept” phase. This proof of concept operated at four levels:

  1. Technological

Concerning the extent to which existing technologies can be adapted for assessment purposes within the portfolio system as currently designed for the DfES “Assessing Design Innovation” project. This will include the applicability of other international work in this area and of any relevant system standards.

  2. Pedagogic

Concerning the extent to which the use (for assessment purposes) of such a system can support and enrich the learning experience of D&T.

  3. Manageable

Concerning issues of making such assessments do-able in ‘normal’ D&T classrooms/studios/workshops:

  • the training/CPD implications for teachers and schools

  • the scalability of the system (including security issues) for national implementation

  4. Functional

Concerning the factors that an assessment system based on such technologies needs to address:

  • the reliability & validity of assessments in this form

  • the comparability of data from such e-assessments in D&T with non-e-assessments

Each of these four ‘proof of concept’ deliverables was explored in schools through a series of small-scale trials. A complete project report covering the four ‘proof of concept’ factors was the required ‘deliverable’ for phase 1 of the e-scape project.

Establishing project e-scape

The work for project e-scape was divided—broadly—into two areas of concern. The first was with the ways in which digital technologies might be used to support learners’ designing. This was the priority concern of the research team at the outset, since we were determined to ensure that any digital systems introduced into the designing activity should operate as an enhancement to the activity, rather than as a distraction or a distortion. Accordingly we worked with schools (some of which had been involved in the ‘Assessing Design Innovation’ project) and explored a range of technologies with learners.

The second area of work concerned the technical systems that would need to be in place for the learners to be able to develop their solution to the task in a web-space, accessible to the learners themselves, and their teachers, and (ultimately) to examination board assessors.

We had in mind to start our explorations with a range of ‘peripheral’ digital technologies, typically hand-held, that we might use to enhance the designing activity. And the activity that we were seeking to enhance was the 6 h ‘light fantastic’ activity developed for the ‘Assessing Design Innovation’ project.

This activity was capable of subdivision into a series of component parts, and for the purposes of exploration with digital peripherals we divided the activity into the following ‘work-parcels’.

  1. to support learners’ designing

    • contextualising; task setting; handling collection

(to contextualise and get the activity up-and-running)

  • early ideas

(to express early ideas and enrich them with support from design teams)

  • design-talk

(to allow and record discussion to enrich the designing activity)

  • photo story-line

(to photo [hourly] the evolution of modelling processes)

  • design bot

(to prompt development through questions and task-related information)

  • project genie

(to connect all the above into a coherent interface)

These work-parcels were developed iteratively. Initially we worked with a new technology and sometimes with the supplier of a new technology until we had developed it to the point where we felt it might be useful to support learners’ designing. At that point we arranged a school trial, often just so we could see what happened. We were frequently unsure about what learners would do with the products and systems, and we were continually astonished at their ability to assimilate the new technologies and make purposeful use of them.

The second area of work (to support teachers’ assessment) was also developed into a series of work-parcels.

  2. to support teachers’ assessment

    • collect & compile files

(to bring together files from different hard/software systems)

  • data transfer and web access

(to make them accessible in a web-space)

  • present and share for assessment

(to present them as a coherent portfolio output for sharing/assessing)

The challenge here was somewhat different, and therefore our methodology was different. We did not focus these work-parcels towards school trials, in part because schools were just not equipped with the technology systems to do what needed doing. Our approach therefore was to engage in a series of meetings with leading-edge systems developers and to a lesser extent Awarding Bodies, to discuss the possibilities for developing systems that might be able to achieve what we increasingly saw as necessary.

The centrality of school trials

I include this brief account of our school trials to illustrate their centrality to our evolving approach. We were experimenting, or rather we were inviting learners to experiment, with a range of technologies that were reasonably cheaply available at the time, but that were almost entirely foreign to their experience of computing in schools. Our first school trial was in Saltash Community School in Cornwall. We worked with a group of year 12 D&T learners, and the purpose of the trial was to explore the impact of several pieces of technology and associated communications systems:

  • digital pens

  • PDAs

  • IR beaming to printers

  • IR beaming between PDAs

  • exploring ‘design-talk’

  • screen sketching on PDAs

Alongside these technological concerns, we were interested to explore the impact of the technology on normal working practices in D&T, particularly on the creative and early exploratory phases of work. We explored ‘design-talk’ using voice recognition software to capture and print-out conversations between learners about their work. This was subsequently taken further in a more focused trial with BAEd students at Goldsmiths.

Over a 6-month period we undertook a number of such trials—with different software and hardware, and with very different learner groups. Despite the 16+ assessment focus of this project (funded by QCA and Awarding Bodies), we included in these trials a series of experiments in primary schools—with year 5 learners.

System development discussions

Quite apart from our school trials of the e-scape approach in the classroom, we also had to undertake a series of discussions with technology-based companies. The focus of these discussions was to explore the systems by which ‘hand held’ technologies in the classroom might be linked (through data transfer) to web-based portfolios and subsequently viewed (remotely) for assessment by Awarding Bodies.

There was (at the time) no system that offered the dynamic integration and presentation features we required for this project. There were, however, a number of e-portfolio platforms that provided the core data-management systems necessary to drive the system we increasingly envisaged. We had a series of discussions with technology providers who we thought might assist us in this task.

The central feature of our requirements in relation to any system that we might adopt was connectivity; the capability to beam data automatically from classroom-based, hand-held technologies into pre-designated web-spaces. We explored several possibilities (e.g., using USB, IR [infra-red], Bluetooth and Wireless systems) and identified our priorities for subsequent development. In the web-spaces we explored a number of presentation options, including morphing, panoramas, zooming (SimpleViewer, Postcard Viewer), galleries (Flickr), and albums (iPhoto). These connectivity and presentation tools formed parallel work packages that we undertook in discussion with the technology companies.

The outcome of these early ‘proof of concept’ experiments was a body of digital work from learners and evidence from the associated e-assessment trials of that work. DfES, QCA and the Awarding Bodies were persuaded of the concept, and we were invited to take the project to the next stage. We were commissioned to undertake (November 2005–January 2007) the necessary research and development to create a working prototype system. The e-scape system was to facilitate the designing activity in the classroom, using peripheral digital tools, and to allow all learners’ work to be tracked and logged in real time in a website for subsequent assessment by Awarding Bodies. The system was to be tested through a national pilot (principally with year 10 learners) in the summer term of 2006.

The details of this development process and the research that underpinned it are fully fleshed out in the papers that follow in this Special Edition of the journal. The papers are broadly clustered into two parts. In part one (Derrick; Kimbell; and Pollitt) the evolution of the e-scape system is described and analysed. This includes the technical priorities and functions of the software system (Derrick), the pedagogic and assessment priorities of the TERU research team (Kimbell) and a detailed account of the evolution of the Comparative Judgement assessment process within e-scape (Pollitt).

In part two of this Special Edition there are four papers that explore the development and implications of research projects undertaken in association with the e-scape system and in collaboration with the TERU research team.

Williams describes and analyses the use of e-scape approaches and technology in Western Australia, specifically in the context of an upper-secondary Engineering Studies examination.

Seery and Canty outline the use of e-scape in the University of Limerick with first year undergraduate technology teacher education students.

McLaren explores the application of e-scape in the transition years (primary 7 to secondary 3) in Scotland and specifically in the context of formative assessment for learning within Scotland’s Curriculum for Excellence.

Davies describes a primary science project in which collaborating schools using e-scape work with Bath Spa University to develop assessment tasks for year 6 science and technology.

This Special Edition of the journal concludes with a collaborative piece from the editors reflecting on the future evolution of e-scape.