1 Introduction

Project-based learning (PBL) involves organizing learning around projects in which learners construct knowledge and skills that are new to them (Thomas 2000). PBL (see Helle et al. 2006) is an “active student-centred form of instruction characterized by students’ autonomy, constructive investigations, goal-setting, collaboration, communication and reflection…” (Kokotsaki et al. 2016). Learning is organized around a shared purpose and outcome decided by the learners themselves. The fact that learners construct and share knowledge and ask and answer questions in an authentic context (Kokotsaki et al. 2016), as opposed to receiving knowledge transmitted by an instructor, points to an underlying philosophy and conception of learning: that of social constructivism. Learning is intended to be social, authentic, meaningful, learner-centered and learner-controlled. What makes PBL unique compared to other forms of constructivist learning is that it goes beyond the construction and sharing of understandings and knowledge to the co-construction of artefacts. Learners are not merely solving problems together but are working towards a shared goal and the creation of one or more shared artefacts. PBL “needs to culminate in an end product” (Berland et al. 2014, p. 268). This is why, in addition to being constructivist, PBL can be termed a constructionist form of learning whereby learners build artefacts that are often public as well as socially relevant and meaningful (Berland et al. 2014). In this regard, constructionism includes constructivism.

The effectiveness of this constructionist form of learning characterized by learner autonomy and self-regulation lies in the capacity of the instructor to effectively “scaffold students’ learning, motivate, support and guide them along the way” (Kokotsaki et al. 2016, p. 272). Instead of teaching, instructing or transmitting knowledge, the instructor scaffolds the complex activity and interactions that occur in PBL. Scaffolding means offering support “for tasks or concepts that the student is initially unable to grasp on his/her own…” (van Rooij 2009). Scaffolding can thus be understood as one approach to formative assessment. Formative assessment is an integral part of PBL (Gulbahar and Tinmaz 2006) and means that assessment occurs during and following PBL (Solomon 2003) or, as Gikandi et al. (2011) recommended, is embedded “within the learning process” (p. 2348). In forms of PBL that follow specific steps or phases, scaffolding, formative assessment and feedback given to teams and individual learners can help them to move progressively through each step towards the final goal of the project and the co-construction of the associated artefacts.

The challenge for the instructor who is assessing and scaffolding PBL is to be able to monitor, review, scaffold and assess the conversation that is at the heart of PBL. Conversation refers to the communication that has to take place in order for teams to carry out their activity and to achieve their shared purpose. In a face-to-face context, the instructor cannot participate in or ‘listen in on’ more than one conversation at a time. Nor can the instructor (or the learners) review the conversation after it has taken place. Also missing from this face-to-face context are records of the conversation and discussion related to formative feedback/assessment. From the learners’ perspective, feedback given orally, once delivered, is not easily available for review. Furthermore, in a face-to-face context, when feedback is communicated through the instructor’s conversations and questioning with individuals and teams, there may be issues of confidentiality. There may also be inefficiencies in communicating feedback, such as the need to repeat a similar message in different contexts.

These limitations related to face-to-face PBL provided the initial motivation to investigate the design and implementation of online PBL and its formative assessment. Gikandi et al. (2011) recommended integrating learning with online formative assessment (FA). Online FA can take many forms, such as ongoing, automated feedback using formative tests (see Mitra and Barua 2015). Yet in a context of PBL or other forms of constructionist learning, online tests are unlikely to provide the type of contextualized feedback required by teams and individuals. What is needed is reliance on the online technology, not for testing, but to capture the rich activity and conversation in PBL. While much has been written about PBL, less is known about approaches to conducting online PBL and formative assessment in terms of understanding the role that technology can potentially play in particular contexts. The study reported on in this paper aims to add to the knowledge in this area. The results will be of interest empirically and theoretically in terms of illustrating a case of how online technologies can be harnessed to support constructionist, learner-centered, directed and driven forms of learning in which conversation, discussion and communication are central to the learning activity. More specifically, the case may be of interest to instructors and instructional designers interested in designing and implementing formative assessment and PBL in their own contexts.

2 Purpose, Objectives and Overview

The purpose of this study was to investigate the role that online technologies can play in FA in constructionist, learner-centered and directed learning such as PBL in which conversation, discussion and communication are central to learning. The investigation was conducted as one case involving pre-service education teachers in a media-creation course in a university in Thailand. The course relies on PBL and formative assessment as the approach to its design and delivery. It is normally taught face-to-face. The case reported on in this paper involved online delivery of the course along with online FA.

The specific objectives were as follows:

  1. Use technology to design and implement online PBL with formative assessment;

  2. Evaluate online learners’ post-implementation PBL knowledge and skills compared with a group learning face-to-face;

  3. Identify learners’ perceptions of the convenience, benefits and barriers of the technology;

  4. Identify learners’ perceptions of their PBL skills and knowledge after using the technology.

3 Review of the Literature

Assessment is at the core of learning in higher education (Gikandi et al. 2011; Stödberg 2012) and is typically discussed in terms of two types: formative (FA) and summative (SA) (Whitworth and Wright 2015). FA depends on timely and informative feedback (Narciss 2008) and is sometimes referred to as assessment for learning (Bennett 2011) that moves the learner forward (Mor et al. 2010). SA, sometimes referred to as assessment of learning, typically involves final instructional judgments of learners’ learning (Ecclestone 2010) for purposes of validation and accreditation (Gikandi et al. 2011). SA involves demonstrating learning after the process has been completed rather than during the process as in FA (Tempelaar et al. 2018). FA involves “appraisals of students’ performance” to meet intended goals (Spector et al. 2016, p. 59). It has been associated with enhanced learning (Bransford et al. 2000), student achievement (Hattie 2009, 2012), performance (Wiliam et al. 2004), and reasoning (Bulunuz et al. 2016). For a more in-depth overview of FA versus SA see Gikandi et al. (2011).

In spite of these potential affordances, FA presents challenges, not the least of which is the need for timeliness as well as a need to facilitate a process that can be complex and onerous. Spector (2016) argued that providing such “timely and meaningful feedback without making use of advanced technologies is difficult to imagine” (p. 61). This need for timeliness and a growth in online learning has contributed to interest in e-assessment (Soffer et al. 2017). The term refers to “reliance on a complex array of tools of varying capacities” to carry out assessment (Tomas et al. 2015, p. 588) and to “collect and store students’ assessment evidences, grade performance, provide feedback and generate reports” (Koneru 2017, p. 129). E-assessment is also referred to as digital assessment (Wang and Kubincová 2017) or computer-assisted assessment (Tomas et al.). Summative e-assessment has often been limited to automated testing and marking but offers more than these possibilities (Tomas et al. 2015). Examples of summative approaches to e-assessment include electronic “forced-choice measures of multiple choice tests, short answer, fill-in-the-blanks, true–false and matching” (Guàrdia et al. 2017). E-assessment that is formative can rely on “very diverse applications from electronic submissions of coursework to the use of wikis or e-portfolios” (Tomas et al., p. 589). In a broad sense, formative e-assessment involves using technology to support an “iterative process of gathering and analyzing information about student learning…in a way that allows the teacher or student to adjust the learning trajectory” (Mor et al. 2010, p. 200). Formative e-assessment has been associated with educational reform efforts due to its potential to “support significant changes” in higher education learning (Pachler et al. 2010), for example, in terms of control of learning by the learners themselves, as well as in relation to ease of administration, time flexibility, and improved accessibility (Whitelock 2007, p. 2).

E-assessment can take place “within learning online and blended settings where the teacher and learners are separated by time and/or space and where a substantial proportion of learning/teaching activities are conducted through web-based ICT” (Gikandi et al. 2011). Vonderwell et al. (2007) posited that the asynchronous nature of online learning changes the assessment because learners can “rethink and assess their own understanding of content” (p. 319). This is because the asynchronous nature means that transcripts and compilations of conversations and discussions can be reviewed and reflected on by learners and assessed by the instructor. Gaytan and McEwen (2007) cautioned against ignoring “the assessment value of e-mail messages, chat room conversations, and discussion board postings” (p. 129) because of the potential insights they can give into learners’ understandings.

In an extensive review of the literature on online FA, Gikandi et al. (2011, p. 2334) found that, if integrated effectively online, FA can potentially support sustained meaningful collaborative interactions among learners and the teacher. The authors also found that online FA involves ongoing learner support, scaffolding and monitoring of learning designed to help learners to engage productively and in deep learning and develop “self-regulated learning dispositions,” and responsibility. As the authors explained, however, in spite of the potential affordances and value of online FA, the emphasis tends to be on summative assessment. For this reason, Gikandi et al. joined others such as Pachler et al. (2010) to argue in favor of focused attention on online FA and “learner and assessment centered learning environments.” Gikandi et al. also drew attention to the need to attend carefully to “the design of these environments” and to make “systematic utilization of a variety of online tools such as online discussions, group interactions, emails and online chats…” (p. 2342).

This design of online FA can be supported by web-based learning management systems (LMSs). LMSs can provide technological solutions for FA (Smart et al. 2015). They are available commercially as well as open source, e.g., Moodle (Koneru 2017). They can also be built in-house assuming the institution has the human resources available to do so. When learners use an LMS, the contents of their communication and aspects of their activity (e.g., file sharing) become ‘visible’ because their conversational activity leaves a trail/trace in the form of transcripts or records. These transcripts and records can potentially serve many purposes in the FA process. For example, in online PBL, learners can review and reflect on transcripts of conversation to identify evidence of their engagement in PBL processes as part of the FA. In addition, LMSs have the power to “harvest data not just on what students learn but also on students’ every learning activity” (Prineas and Cini 2011, p. 4). That type of data can potentially be used to triangulate evidence from conversation.

Learners can also make use of technologies such as e-portfolios or “purposeful aggregation[s] of digital items” (JISC 2008) in order to play a “greater role” in learning (Cambridge et al. 2009, pp. 195–196). E-portfolio-based assessments have grown in popularity with the emergence of networked computing and the Internet and facilitate a move away from “traditional assessment practices” (Morales et al. 2015, p. 1737). E-portfolios facilitate the logistical aspects of assessment and the overview of information collected (Van der Vleuten et al. 2015). They also overcome the limitations of paper-based portfolios, which consume “too much time and manpower” related to their management (Chang et al. 2014). E-portfolios, or digital portfolios (Brown 2015), can be easily set up by instructors and instructional designers. They can be designed differently according to the context (Ritzhaupt and Singh 2006). However, it is important to note that e-portfolios can be used for formative or summative assessment.

Figure 1 presents a summary of the concepts reviewed in this section and related to online formative assessment.

Fig. 1 Summary of the concepts related to online PBL and formative assessment

4 Methods

4.1 The Case

The particular case selected was that of a media-creation course for undergraduate pre-service teachers of English as a foreign language at a university in Thailand. The course relies on PBL with the instructor as facilitator and formative assessor. In the course, students work in teams to co-create learning resources (artefacts), such as a video to teach vocabulary related to the project topic. The course is normally delivered face-to-face, on campus, in classrooms, with the PBL teams collaborating and co-located in time and place. The course schedule is outlined in Table 1.

Table 1 The media-creation course schedule using PBL

Learners have extensive responsibility for managing their own projects. This means that they must take responsibility and develop metacognitive skills to learn about their learning. They must reflect on their learning and systematically monitor and review their PBL activity. At the end of each step, they are required to submit evidence of their engagement in PBL. To organize their evidence of effective engagement at each step, learners prepare a portfolio. The portfolio is organized around five folders, one for each of the five steps. Within each folder, at the relevant week, the learners submit a description with a piece of evidence to illustrate how they engaged in PBL, specifically in relation to the indicators in the framework presented in Table 2 (a hypothetical sketch of this folder structure appears after the table). The limitations of this approach in the face-to-face course and the difficulty associated with capturing the conversation served as the motivation to conduct the investigation. To conduct the investigation, one section of the course was delivered online with online FA while the other was delivered face-to-face. Both sections relied on the same framework and approach briefly summarized in Table 2. PBL approaches and frameworks can vary and are often adapted, as was the case in this study, for the particular context. Typically, PBL uses a process–product approach (Stoller 1997). The PBL framework used here has five phases/steps. Others may have three, such as project launch, guided inquiry and project/problem conclusion (see English and Kitsantas 2013).

Table 2 PBL framework
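As a purely illustrative aside (not part of the original course materials), the five-folder portfolio structure described above might be scaffolded programmatically as in the following minimal sketch. The step names follow the paper’s five-step framework (Table 2); the folder and file naming conventions are assumptions.

```python
from pathlib import Path

# Hypothetical sketch only: scaffold one learner's portfolio with one folder
# per PBL step. The step names follow the paper's five-step framework
# (Table 2); folder and file naming conventions are assumed for illustration.
PBL_STEPS = ["1_explore", "2_review", "3_select", "4_produce", "5_present"]

def scaffold_portfolio(root: Path, learner_id: str) -> Path:
    """Create the five-step folder structure for a single learner."""
    portfolio = root / learner_id
    for step in PBL_STEPS:
        step_dir = portfolio / step
        step_dir.mkdir(parents=True, exist_ok=True)
        # Each step holds a brief reflection plus supporting evidence files.
        (step_dir / "reflection.txt").touch(exist_ok=True)
        (step_dir / "evidence").mkdir(exist_ok=True)
    return portfolio

scaffold_portfolio(Path("portfolios"), "learner_001")
```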

4.2 Objective 1: Design and Implementation of the Online PBL with FA

4.2.1 Participants

Those involved in the design and creation were a programmer (PHP, MySQL, MatLab) and the principal investigator (PI), who had a background in computer science and a master’s degree in Information Technology. The PI was also an instructor in a Faculty of Education at a Thai university.

Participants for the implementation were 60 learners: 30 assigned to an online section and 30 to a face-to-face section of the course. All were Bachelor of Education pre-service teachers specializing in English as a foreign language and enrolled in the media-creation course. Two learners subsequently dropped the online section. The actual numbers were, therefore, 30 in the face-to-face section and 28 in the online section.

4.2.2 Procedures

The design and creation relied on an iterative process involving five steps as follows:

  1. Identify the target context to determine who will use the system and why.

  2. Determine what the system will include, how the interface will appear, and how data will be accessed.

  3. Determine the format of the content included.

  4. Determine the platform and how data will be uploaded, downloaded, saved and presented.

  5. Determine how the system will capture student data and specify the algorithms to be implemented.

The implementation was led by the PI and conducted in the context of a media-creation course. It lasted for 16 weeks and was entirely online. The implementation is described in more detail in the results.

4.3 Objective 2: Evaluate Online Learners’ Post-implementation PBL Knowledge and Skills Compared with a Group Learning Face-to-Face

4.3.1 Procedures

At the end of the implementation, the online learners (n = 28) and the face-to-face learners (n = 30) were given an electronic post-test of PBL knowledge and skills. Post-test items related not to their own evidence of engagement in PBL but to their understanding of PBL knowledge and skills. The learners were given 30 min to complete the post-test.

4.3.2 Instrument

The instrument items followed those listed in Table 2 within the five categories. The instrument had 27 items related to the PBL framework. As with all instruments used in the study, the language of presentation was Thai.

4.3.3 Data Analysis

Data were analyzed using means, standard deviations and t-tests conducted using SPSS. Independent-samples t-tests were used to test for significant differences between the means of the two groups.
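For readers who want to reproduce this type of comparison outside SPSS, the following is a minimal sketch of an independent-samples t-test; the score arrays are invented placeholders, not the study’s data.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the analysis described above: an independent-samples
# t-test comparing post-test means of two sections. The scores below are
# invented placeholders, not the study's data.
online = np.array([23, 25, 24, 26, 22, 25, 24, 23])
face_to_face = np.array([21, 22, 20, 23, 21, 22, 19, 22])

t_stat, p_value = stats.ttest_ind(online, face_to_face)  # two-tailed by default
print(f"online: M={online.mean():.2f}, SD={online.std(ddof=1):.2f}")
print(f"F2F:    M={face_to_face.mean():.2f}, SD={face_to_face.std(ddof=1):.2f}")
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```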

4.4 Objective 3: Identify Learners’ Perceptions of the Convenience, Benefits and Barriers of the Technology

4.4.1 Procedures

Learners’ perceptions of the online PBL and FA system were collected through a survey administered in the classroom. Prior to administration of the survey, the researcher explained to learners that there were no correct or incorrect answers and that completion would have no effect on grades. The researcher read and explained each item and answered any questions learners may have had. The learners were given 30 min to complete the survey. Responses were anonymous.

4.4.2 Instrument

The survey used a five-point Likert scale with a total of 23 items and was separated into two sections. The first section focused on convenience (10 items). The second focused on benefits (eight items) and barriers (five items) of online PBL and FA. The section on convenience used a scale with the following choices: 5—very convenient and easy, 4—somewhat convenient and easy, 3—don’t know, 2—somewhat inconvenient and difficult, 1—very inconvenient and difficult. The second section on benefits and barriers used the following scale: 5—Strongly agree, 4—Agree, 3—Don’t know, 2—Somewhat disagree, 1—Disagree. The final draft was established following several rounds of revision based on feedback from five individuals with expertise in educational technology, media creation, and/or PBL. Cronbach’s alpha values for the survey of learners’ perceptions of the convenience, benefits and barriers of the online PBL with FA were greater than 0.7, indicating a high degree of internal consistency.
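As an illustration of how the reported reliability statistic can be computed, below is a minimal sketch of Cronbach’s alpha applied to an item-response matrix; the matrix is an invented placeholder, not the study’s survey data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented placeholder responses (5 respondents x 4 items on a 1-5 scale),
# not the study's survey data.
responses = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 3],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [4, 3, 4, 4],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```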

4.4.3 Data Analysis

Data were analyzed using descriptive statistics such as frequency, percentage, mean and standard deviation.
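A minimal sketch of this kind of descriptive analysis for a single Likert item follows; the responses are invented placeholders, not the study’s data.

```python
import pandas as pd

# Minimal sketch of the descriptive analysis named above, applied to one
# Likert item. The responses are invented placeholders, not the study's data.
item = pd.Series([5, 4, 4, 3, 5, 4, 2, 4, 5, 3], name="item_1")

counts = item.value_counts().sort_index()      # frequency per response level
percent = (counts / len(item) * 100).round(1)  # percentage per level
summary = pd.DataFrame({"frequency": counts, "percent": percent})

print(summary)
print(f"mean = {item.mean():.2f}, SD = {item.std(ddof=1):.2f}")
```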

4.5 Objective 4: Identify Learners’ Post-implementation Perceptions of their PBL Skills and Knowledge

4.5.1 Procedures

At the end of the 16-week implementation, a survey was administered in the classroom. Prior to administration of the survey, the researcher showed learners how to complete the survey and explained that there were no correct or incorrect answers and that completion would have no effect on grades. She read and explained each item and answered any questions learners may have had. The learners were given 20 min to complete the survey.

4.5.2 Instruments

The survey had 27 items rated on a five-point Likert scale: 5—Strongly agree, 4—Agree, 3—Don’t know, 2—Disagree, 1—Strongly disagree. The items corresponded to the five steps of PBL as outlined in Table 2.

4.5.3 Data Analysis

Data were analyzed using descriptive statistics such as frequency, percentage, mean and standard deviation.

5 Results

5.1 Objective 1: Design

A learning management system (LMS) hosted on the university’s servers was constructed in-house using open-source tools: a general-purpose scripting language (PHP) along with an open-source database management system (MySQL). The LMS included an instructor interface as well as a learner interface. Communication tools included a discussion forum and CHAT rooms. Both learners and the instructor had access to these tools. The instructor could read and potentially comment on or respond to any asynchronous discussion and synchronous CHAT conversation. The discussion forum was pre-structured with five main topics corresponding to the five PBL steps. Communication tools also included an internal email system. The instructor could not view learner-to-learner emails but could view and participate in all discussions and CHAT sessions. The instructor could also obtain, for each learner, a compilation of all their activity in the discussion and CHAT for each PBL step. The system also provided an activity report for each learner. This report represented a log of activity in quantitative format, such as number of logins and number of posts and replies to posts. The e-portfolio consisted of a simple internal file ‘dropbox’ with five folders corresponding to the five PBL steps. These folders could accommodate files in multiple formats, including large video files representing, for example, a draft of a learning resource. Figures 2 and 3 show the interface and tools for the system.

Fig. 2 Student’s interface

Fig. 3 Instructor’s interface

5.2 Objective 1: Implementation

Implementation followed the schedule outlined in Table 1. Learning was driven by learners interacting and working together and progressing through the PBL steps with shared purposes and goals. These purposes related to creating learning resources (final artefacts) that, as pre-service teachers, they might subsequently use in contexts of teaching and learning. Learners could use the discussion forum or the CHAT tool to communicate, interact, plan and share ideas as well as files. Near the end of each step in the PBL process, learners were expected to compile and submit physical and reflective evidence of their engagement in each of the associated behaviours outlined in the framework. For this purpose, each learner’s e-portfolio contained five separate folders in which to store evidence, one for each of the five PBL steps. This meant that, for the particular step, they could include an artefact such as a discussion post or a short document in addition to a brief (one-page) reflection on their engagement and evidence. In addition, the system was preprogrammed to compile information related to each learner’s activity and place it into a report accessible to the instructor. The report included quantitative information related to learners’ participation, such as the number and duration of logins, which tools were accessed when and for how long, the number of discussion and CHAT messages/conversations posted and read, emails sent and received, and content accessed (a sketch of this kind of aggregation follows Fig. 4). The instructor could then use this information for each student to validate, triangulate or complement information provided by the learners in their e-portfolios. The instructor provided feedback using the discussion forum, which allowed for both individual (private) conversations as well as whole-group or smaller-group conversations and exchanges. The group discussion was relevant where the instructor identified feedback that pertained to more than one individual. Learners could then use this feedback to improve their performance and engagement in subsequent PBL steps. The implementation is illustrated in Fig. 4. The figure’s arrows are intended to show that the activity was cyclical and iterative.

Fig. 4 Online PBL and formative assessment in a media-creation course
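Continuing the hypothetical activity-log sketch from the previous section, the per-learner report described above might be compiled with a simple aggregation; the following self-contained sketch again uses assumed names, not the system’s actual code.

```python
import sqlite3

# Self-contained sketch of compiling the per-learner activity report described
# above (counts of logins and messages posted/read per PBL step). Table and
# column names repeat the earlier hypothetical schema; none come from the paper.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE activity_log (learner_id TEXT, pbl_step INTEGER, "
    "tool TEXT, action TEXT, created_at TEXT)"
)
conn.executemany(
    "INSERT INTO activity_log VALUES (?, ?, ?, ?, ?)",
    [
        ("learner_001", 1, "forum", "login", "2020-01-06T09:00:00"),
        ("learner_001", 1, "forum", "post", "2020-01-06T09:05:00"),
        ("learner_001", 1, "chat", "read", "2020-01-06T09:10:00"),
    ],
)

report = conn.execute(
    """
    SELECT learner_id, pbl_step,
           SUM(action = 'login')            AS logins,
           SUM(action IN ('post', 'reply')) AS posted,
           SUM(action = 'read')             AS read_msgs
    FROM activity_log
    GROUP BY learner_id, pbl_step
    ORDER BY learner_id, pbl_step
    """
).fetchall()

for learner_id, step, logins, posted, read_msgs in report:
    print(f"{learner_id} step {step}: logins={logins}, posted={posted}, read={read_msgs}")
```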

5.3 Results for Objective 2: Learners’ PBL Skills and Knowledge Post-implementation

Table 3 shows the results of a post-test to measure knowledge and skill related to PBL. Learners in both the online and the face-to-face (F2F) sections completed the post-test. As the results indicate, learners’ levels of understanding were higher in the more advanced steps of PBL. They were significantly higher for the online section than for the F2F section. For the online section, the lowest levels of understanding were in the PBL steps related to the skills of exploring and reviewing.

Table 3 Post-implementation comparison between the online and F2F sections

5.4 Results for Objective 3: Learners’ Perceptions of the Convenience, Benefits and Barriers of Online PBL and FA

5.4.1 Convenience

Results revealed that most learners perceived the system as convenient. They identified the login process as the most convenient while the user interface was considered the least convenient. The survey was limited in scope to closed items. Therefore, learners did not have the opportunity to explain why they found the user interface less convenient than other aspects of the system. Entering content into the e-portfolio was among the most convenient items (Table 4).

Table 4 Survey results for convenience

5.4.2 Barriers and Benefits

Table 5 presents the results related to students’ perceptions of the barriers as well as the benefits of using the e-assessment system. The results revealed that the top-ranked items were 1, 4, and 2: using the system with PBL increases my knowledge and skills; assessment in the system is better than traditional assessment; and the system is an excellent learning resource. The lowest ranked were the negative items referring to barriers: technical problems, ICT literacy barriers, lack of time, and PBL feedback.

Table 5 Barriers and benefits of using system with PBL

5.5 Objective 4: Learners’ Post-implementation Perceptions of Their PBL Knowledge and Skills

Table 6 presents the results of the online learners’ perceptions of their PBL skills after learning online.

Table 6 Results of learners’ perceptions of PBL skills after implementation

Students ranked highly being able to make observations and think creatively. This means that they perceived themselves as being more able to engage in these behaviors following their participation. In the Review step, they ranked highest collecting data and compiling information. Under the Select step, they ranked highest being able to assess and make decisions as well as being able to engage in metacognitive thinking. Under the Produce step, they reported being better able to problem-solve and complete any technical work required. Finally, under the last step, Present, they ranked most highly being able to create a final project artefact as well as to plan strategically. The lowest ranked items were being able to make and present project decisions and to work in a team in the initial PBL step of Exploring. However, the highest ranked item overall also pertained to assessing and making decisions, in the Select step. Working in a team ranked lower on average across the different steps. Also ranked low, under the Produce step, were sampling and collecting data and asking questions and making assumptions. In general, the lower ranked items were in the first PBL step, Exploring, in particular working in a team and making and presenting project decisions.

6 Discussion

This study has illustrated one case of how online technology can support complex forms of constructivist/constructionist learning such as PBL that depend on FA and extensive communication and interaction. Specifically, the online technology provided the following affordances or capabilities:

  1. an LMS for organizing and managing learning;

  2. communication tools to support conversation, interaction, collaboration, sharing, construction and creation;

  3. transcripts of this communication for learners to use for self-monitoring and reflection and for evidence in their e-portfolios;

  4. communication tools (same as in item 2) to support text-based conversation, interaction and communication between the instructor and learners for purposes of ongoing, timely FA and feedback;

  5. transcripts of this feedback for learners to reflect on and include as part of evidence;

  6. additional quantitative information (from system logs/reports) pertaining to learners’ participation for use by instructors to triangulate learners’ evidence;

  7. file storage and sharing capability for learners (including e-portfolio) and instructor.

In this case, the assessment and learning were intertwined and took the form of learners interacting, collaborating, constructing and creating with shared goals and purposes, supported by three main scaffolds: the instructor’s formative assessments and feedback; the PBL framework and approach; and the LMS and e-portfolios (including the communication and file-sharing tools). The technology served as a foundational scaffold to support not only the learners but also the instructor, and not only the assessment but also the learning. The learning depended on a high degree of scaffolding because it was ill-structured, in contrast to a highly structured syllabus designed to deliver a pre-determined body of knowledge. It also depended on extensive communication and interaction, which was supported by the communication tools. It depended on learners monitoring their learning, reflecting on it in relation to the PBL framework and in relation to previous feedback, and subsequently ‘reacting’ as they moved through the framework towards construction and creation of the final artefact. By communicating online, the learners had a record of activity, conversation, decisions, plans, discussions etc. They could use this record to analyze and reflect on their activity and behaviors after they happened. They could also use the record as a source from which to extract evidence to show how they engaged in PBL. They could share this analysis and evidence using their e-portfolios.

This iterative and cyclical process is outlined graphically in Fig. 5. The FA is fundamentally supported by the technology in the form of a simple LMS with communication and file-sharing tools and e-portfolios for the learners’ and the instructor’s use. The figure aims to show the relationships among all the elements in this context of PBL as interconnected, dependent on and building on each other.

Fig. 5 Technology as a foundational scaffold for formative assessment in PBL

Results of the surveys of learners’ perceptions suggested overall learner satisfaction with this form of learning and few barriers. Two learners did drop out, but it is not known if the technology played a role or presented a barrier. The results of the PBL knowledge and skills post-test showed higher achievement for the online versus the face-to-face section. These results offer encouragement in relation to Thailand’s efforts to reform education to privilege more learner-centered, meaningful forms of learning. This study has shown that such learning can be accomplished successfully in an online context with tools for support and scaffolding. The formative nature of the assessment, however, was not due to the technology but to the pedagogical approach to PBL. As Pachler et al. (2009) argued, “it is the learners and teachers as human actors who ultimately determine the formative effects of engaging with technologies” (p. 8). It is not the technology that is formative; rather, it is up to the instructor and learners to use it formatively, as was the case in this study.

This paper has illustrated one case of how the technology might be used formatively and the role that it can play in the service of FA in general and in PBL in particular. In a review of advanced e-assessment techniques (see Ripley 2009), the authors argued that e-assessment tools used to be created by “technically expert staff” but that, increasingly, it is the academics [and instructors] who are independently “implementing their own e-assessment” (p. 2). The authors of the review noted that advanced techniques for e-assessment are those that are “used in isolated or restricted domains, and which have successfully applied technology to create an assessment tool” (p. 2). This study illustrates one such case where technology has been applied for assessment. What makes this study’s case unique is that the context is PBL; learners are controlling their learning and contributing indirectly to their assessment; and conversation and communication are central to the learning and the assessment.

This study relied on tools that are likely to be easily accessible to higher-education instructors. There are other technologies that can play a role in assessment, but they may not be as accessible or as easily adapted to the requirements of formative assessment in PBL. Learning analytics represent one of these technologies. Analytics can potentially “tailor educational opportunities to each student’s level of need and ability” (Johnson et al. 2011, p. 28). They can also provide a “precise description of learner behavior,” deliver an analysis of how the behavior interacts with other constructs, and track “how it grows and changes over time” (Berland et al. 2014). Yet these affordances are more potential than common. In fact, learning analytics remains an emerging field (Khalil et al. 2016; Mah 2016) of interest more to researchers than to practitioners and for which “the role of the pedagogical context has not been fully understood” (Nguyen et al. 2017, p. 1). Learning analytics also raise issues of ethics, privacy, and data security and ownership (Khalil et al.) and “come with costs” (Bienkowski et al. 2012, p. viii). In the context of this study, learning analytics are a reminder of the need for technological solutions that are affordable, adaptable, useable, feasible and scalable for institutions and instructors. They are also a reminder of the need for and lack of tools to support more formative approaches to assessment. These tools should ideally be able to mesh with existing aspects of the pedagogy. In this study, the technological solutions functioned in conjunction with the PBL framework and approach.

Online PBL with FA, therefore, depends on technology that can be at the service of learning activity, and that can support and scaffold it. This is a very different role from the one that learning analytics might play. Formative analytics (see Sharples et al. 2016) may be more relevant to this context than learning analytics. Sharples et al. (2016, p. 32) described formative analytics as analytics “for rather than of learning.” By analytics for learning is meant that the analytics help the learners “to reflect on what has been learned, what can be improved, which goals can be achieved, and how to move forward” (p. 32). Formative analytics might involve “real-time personalized automated feedback and visualizations of potential learning paths as well as tutoring systems that assess mastery and understanding of key concepts and their inter-relations” and then offer “instant formative feedback” (Sharples et al., p. 32). It is not clear if these affordances of formative analytics would be relevant or useful in a context of FA and PBL. Can they include the conversation, as was the case in this study? Do they put the learner at the centre? These questions are meant to highlight the uniqueness and complexity of the learning and assessment in this case, which requires technology to be highly adaptable to the context. In the case of this study, the technology was adopted from other contexts and not expressly designed for formative assessment. The learning borrowed the technology. This is in contrast to the use of applications that might be designed specifically for FA but that may lack the capacity to adapt to the specific requirements of the context, as the technology did in this case.

7 Conclusions

This study was conducted in Thailand, a country grappling with the need for educational reform to move learning towards more authentic, learner-centered approaches. What makes the case relevant beyond the narrow context of this one study and country, however, is that it illustrates a response to a broader, more universal challenge: how to use technology to support constructionist forms of highly learner-centered and directed learning, such as PBL, that depend on FA and extensive communication and conversation. This study illustrated one example of how this can be done using online technologies such as an LMS and e-portfolios familiar to many contexts of learning in higher education. However, the study showed that while the online technology provides a foundational scaffold, it functions in concert with the instructor’s formative feedback and with the pedagogical scaffolds provided by the PBL approach and framework. This finding confirms the argument that while definitions of online FA always include a reference to some form of technology such as ICTs or computers, as Mor et al. (2010) explained, it is not the technology that is formative; it is how it is used that determines whether assessment is formative or summative: “No assessment technology is in itself formative, but almost any assessment technology can be used in a formative way” (p. 201). Online FA relies on technological as well as social resources “to engage agentively with evidence of learning in order to effect changes in understanding” (Daly et al. 2010). There is “no definitive amount or use of the technology”; rather, the FA can include technology to varying degrees (Daly et al. 2010, p. 634). The assessment must, however, include evidence “about a learner’s state of understanding relative to desirable goals, and where individuals are enabled to take actions which have formative effects” (Daly et al., p. 634). Figure 5 made evident that the technology played a central and foundational role. Yet the figure also showed that it was only one element in the highly complex activity of individuals coming together to co-create.

This study has privileged the use of the online tools of an LMS and e-portfolio to support communication and feedback in PBL. In other contexts and subsequent investigations, social media might be combined with these tools. However, the addition should be made in a way that allows the instructor to continue to play the central role of providing FA by following the conversation and communication and responding to it. Similarly, PBL might rely on a blended use of the LMS and e-portfolio combined with social media as well as face-to-face sessions. If this option is chosen, instructors need to be aware that not all of the communication may be available to review after the activity has been completed. Instructors interested in replicating the study presented in this paper should note the need to rely not only on technology but also on a learning framework that supports managing and organizing learning material and that structures the submission of evidence. In addition, they should recognize the central role played by learners in the FA, who must provide evidence of their engagement in PBL.

This study was limited to a focus on learners’ perceptions of their experiences. Future studies might investigate the perceptions of instructors, for example, in relation to workload with online FA in PBL. Given the highly contextual nature of PBL, which can vary from one learning situation to the next, it will be useful to see other examples or instances of similar investigations in other contexts and countries. Participants in this study were not involved in any form of peer-assessment. However, FA from peers coupled with that from instructors could be built into the PBL. The technology would provide a capacity to monitor and follow the formative peer-assessment. That is a scenario that might be examined in future studies.