Formative Assessment as an Online Instruction Intervention: Student Engagement, Outcomes, and Perceptions

Online education has long suffered from high dropout rates and low achievement. However, both asynchronous and synchronous online instruction had to become effective quickly in order to maintain undisrupted learning during the COVID-19 outbreak. The purpose of the present study was to examine student engagement, learning outcomes, and students' perceptions of an online course featuring frequent tasks, quizzes, and tests as formative assessment. Data were collected from the first five weeks of a course that was temporarily converted from blended learning to fully online delivery during the school closure. Analysis of students' learning records and scores indicated that students engaged actively in all of the online learning activities and achieved high scores in all tasks, quizzes, and tests. In addition, students held positive perceptions of the formative assessment.


INTRODUCTION
The year 2020 witnessed the worldwide COVID-19 outbreak. To contain the spread of the global pandemic, governments all around the world had to close their educational institutions (UNESCO, 2020a). In response to this crisis, educational institutions in every country gradually provided both asynchronous and synchronous online instruction to students enrolled at different levels of education, in an attempt to ensure that learning remained uninterrupted during the school closures.
To support such an unplanned and rapid move to online education, UNESCO (2020b) launched the Global Education Coalition, provided technical assistance, selected digital learning resources, etc., to promote inclusive learning opportunities for students. In response to the sharply increasing demand for asynchronous and synchronous online instruction tools, many online learning platforms and live streaming providers are offering free access to their services (Li & Lalani, 2020).
Nevertheless, the temporally and spatially separated status of students and teachers in this unprecedented period is subject to the long-standing criticism of distance education: it is challenging to monitor and diagnose student learning (Cheng et al., 2013), giving rise to a compromised teaching effect. Research has shown that students feel isolated in online education (Hammond, 2009; Vonderwell, 2003; Woods, 2002), resulting in high dropout rates (Carr, 2000; Hodges & Kim, 2010; Rovai, 2002), high rates of boredom, and low achievement (Chapman et al., 2010; Fredricks, 2015).
One promising solution to this dilemma is effective pedagogical design that provides students with formative assessment in both asynchronous and synchronous online instruction settings.
Formative assessment, also referred to as "assessment for learning", is generally considered a planned process in which different assessment activities are scheduled to elicit evidence of student learning, leading to teachers' instructional adjustments or students' learning tactic adjustments (Black & Wiliam, 2009; Looney, 2011; Popham, 2008; etc.). If used appropriately, formative assessment can promote student learning and achievement (Black & Wiliam, 1998a; Black & Wiliam, 1998b; Andrade & Heritage, 2018).
Nevertheless, online formative assessment in the aforementioned studies was conducted within a blended learning approach that deliberately combines online and face-to-face instruction (Graham, 2006); little is known about whether such formative assessment could bring about a similar teaching effect in fully online instruction.
In fact, research on formative assessment that actively engages students in both asynchronous and synchronous online instruction and improves learning performance as well would shed light upon online teaching as an emergency response, and even upon regular online education and blended learning in the future. Therefore, the present study designed frequent tasks, quizzes, and tests as formative assessment throughout the online instruction process and collected data to examine students' learning performance, in an attempt to answer the following three questions: 1. Does the formative assessment engage students throughout the online instruction process? 2. Does the formative assessment generate an impact on students' learning outcomes? 3. What are students' perceptions of involvement in the online formative assessment?

THE PROPOSED FORMATIVE ASSESSMENT INTERVENTION IN ONLINE INSTRUCTIONS
Considering the probable benefits of using formative assessment in online instruction and the importance of well-designed tasks and activities for formative assessment (Johnson-Smith, 2014; Ray, 2004), this paper proposes the careful use of frequent tasks, quizzes, and tests as formative assessment to involve students in the online learning process and to facilitate their learning (see Figure 1). Figure 1 depicts formative assessment for both asynchronous and synchronous online instruction. Asynchronous online instruction could be delivered via a learning management system (LMS) such as Moodle, providing videos, drill-and-practice exercises, discussions, tests, assignments, etc. Firstly, the drill-and-practice exercises help students practice the skills illustrated in the videos, whereas the asynchronous discussions direct students to reply to threads that invite case analysis, evaluation, solutions, etc. Furthermore, the module test helps students form a holistic picture of the content in the module. Finally, an assignment helps students put the knowledge into practice.
Apart from the asynchronous online instruction, synchronous online instruction could be offered through a combination of live streaming media (e.g., Zoom) and a real-time quiz tool (e.g., Kahoot!), creating a virtual classroom at a scheduled time at which students enjoy real-time interaction with their teacher and peers (Alonso et al., 2005). Formative assessment in the live streaming meetings features a sequence of tasks and quizzes: oral/written discussions, evaluation of peers' performance in the discussions, and real-time quizzes.
The benefits of such a pedagogical design are twofold. On the one hand, students' engagement can be assured by regularly completing tasks, quizzes, and tests during the whole online learning process, improving learning performance. On the other hand, teachers can constantly and regularly obtain a record of student learning, based on which they can provide prompt feedback and make immediate adjustments to follow-up instructional activities, improving the teaching effect. Popham (2008) classified the proportions of formative assessment activities in classrooms as "No Formative Assessment, Token Formative Assessment, Moderate Formative Assessment, and Near-Total Formative Assessment". Meanwhile, Wiliam & Thompson (2007) categorized the "cycle times for formative assessment" into three types, namely, short-cycle, medium-cycle, and long-cycle. These three adjustment cycles differ in terms of focus and length. For example, the short cycle has a focus within a single lesson, lasting from five seconds to one hour.
These classifications guide classroom teachers in planning the quantity of formative assessment activities and the adjustment cycles within a lesson/unit/semester. For example, a teacher delivering a course that allots 4 weeks to one learning module might plan Moderate Formative Assessment in a medium cycle.

Converting Previous Blended Learning Course to Be Fully Online
According to the original teaching schedule of the Spring semester of 2020, the first author was delivering a course entitled College English II, a follow-up to College English I, to the same group of students. This course series aims at cultivating independent English users as described by the Common European Framework of Reference for Languages (CEFR). In alignment with the level scale of the CEFR, students reach an intermediate level of English proficiency by attending College English I, and achieve an upper-intermediate level by the end of College English II, getting ready for the academic English course in the following semester. The course targets first-year college students who learn English as a foreign language (EFL). It lasts for 16 weeks, requiring 160 minutes of learning every week.
The course has been delivered in the form of blended learning. According to the previous teaching plan, the online instruction would be provided through an open online course entitled Internet+College English on iCourse (www.icourses.cn), a nationwide learning platform. That course is characterized as technology-enhanced language learning, covering five modules: speaking, listening, reading, writing, and translating. The open online course enrolls students from all over the country, whereas the face-to-face meetings would be provided exclusively to students in the first author's university.
During the school closure due to the COVID-19 outbreak, the first author converted her previous blended learning course to fully online delivery, providing asynchronous online instruction via the open online course and replacing the previous face-to-face sessions with regular live streaming meetings (synchronous online instruction). Hence, the course (College English II) was provided through a combination of asynchronous and synchronous online instruction.

The Synchronous Online Instruction and the Real-Time Tasks/Quizzes
The synchronous online instruction was arranged with CCtalk (https://www.CCtalk.com), a nationwide live streaming platform. Meanwhile, the real-time quizzes and interactions that occurred in the live streaming meetings were conducted with MuClass, an interaction tool affiliated with iCourse that is specialized for real-time interaction in classroom settings. Every student was required to log in to iCourse and MuClass with the same username, so that his/her learning data on iCourse would be linked to MuClass and accessible under the instructor's account.
Note that the MuClass tasks/quizzes were released by the instructor manually and delivered to students' mobile devices right before every task/quiz began, and remediation was not available after class once the instructor closed the access. Such a design ensured that students participated fully in the whole live streaming meeting, as they could never anticipate when a scored task/quiz, which they could not afford to miss, would appear.

Participants
Participants were 60 first-year college students (23 males and 37 females; mean age of 19) who were EFL learners. These students had completed College English I in the previous semester. That course had been delivered by carefully blending the first author's small private online course on iCourse with face-to-face meetings, along with very few real-time quizzes from MuClass (on average, one quiz in an 80-minute meeting).
As a result, these students had experience in navigating the open online course on iCourse and in participating in real-time tasks/quizzes from MuClass, whereas they had no experience of live streaming meetings. Therefore, a trial live streaming session was provided one day before the first formal live streaming. Students were guided to log in to CCtalk, join the course, and then check whether they could hear the instructor clearly and whether they could be heard when they spoke during the live streaming.
Students were advised to use one device (a computer or laptop was suggested) for live streaming participation and one mobile device for MuClass interaction. A one-question survey in the first live streaming found that 47 students (78.33%) used a personal computer (laptops included) to log in to CCtalk for the synchronous online lecture and a mobile phone to log in to MuClass for pop-up quizzes/tasks during the live streaming meetings. Meanwhile, eight students used a single mobile phone to navigate the whole live streaming meeting, shifting from the live streaming window to the quiz interface during quiz time (see Table 1).

Teaching Procedure
In February 2020, the first author of the present paper began to offer the course College English II in the form of both asynchronous and synchronous online instruction to the participants. The teaching schedule was planned based on the previously scheduled school timetable. Most importantly, the quantity and frequency of the tasks, quizzes, and tests serving as formative assessment were designed carefully (see Figure 2).
The timeline of the online teaching procedure in Figure 2 indicates that a warming-up module and two learning modules (listening and speaking) were completed before March 22nd.
The warming-up module took up 4 days, in which students were invited to familiarize themselves with the online learning platform, the teaching syllabus, and the learning objectives, and to warm up by posting self-introductions in the discussion forums and completing a test.
Each learning module took up 2 weeks, beginning with self-directed asynchronous online learning, followed by a synchronous online learning session in the second week. As indicated in Figure 2, the main asynchronous online instruction started on February 24th, while the live streaming-based synchronous online instruction was implemented every other week beginning March 3rd. The total time for asynchronous online self-directed learning was designed to take up no more than 120 minutes every week, while the live streaming-based synchronous online learning session was set at 80 minutes.
It is noteworthy that although the formative assessment activities in the asynchronous online instruction changed themes and raised the language difficulty level in the Spring semester of 2020, their quantity and requirements were similar to those of the previous semester. The actual variation lies in the synchronous online instruction design, in which 4-8 formative assessment activities were arranged within 80 minutes, in contrast with the previous semester, in which merely 1-2 formative assessment activities were arranged within 80 minutes.

Figure 2. The online instruction procedure and teaching schedule
To take the formative assessment in the speaking module as an example: for the asynchronous online learning, students were required to watch 15 videos lasting 34 minutes and 16 seconds in total. It might take students more than 60 minutes on average to finish watching, as they might replay or pause the videos once in a while. Meanwhile, they were directed to complete 6 oral exercises (60 minutes suggested) and were invited to make comments or ask questions on the videos by replying to the threads in 10 forums, which was estimated to occupy at least 30 minutes. Apart from that, they had to complete one module test (30 minutes). Last but not least, they were required to complete an oral assignment, a 3-minute recording of a prepared speech. This task was estimated to take up approximately 60 minutes, as students would have to spend much time drafting and editing speech notes, then recording and editing, etc. In terms of time spent on learning, students would spend approximately 240 minutes on the asynchronous online learning if they finished all of the learning tasks, quizzes, and tests as required.
When it comes to the synchronous online learning, students participated in four instructional activities, each taking up 15-20 minutes. During the Speaking live streaming meeting, all students were invited to finish and submit a voting task, three peer-evaluation tasks, and a pair-work discussion (they set up audio chats, recorded the conversations, and then submitted them). The three peer-evaluation tasks included one MuClass-based task to fill in an evaluation form on an exemplar speech and two textual evaluations conducted on the iCourse online forums. In addition, three oral sharing tasks were implemented, in which 12 of the 60 students were invited to take the online microphone and speak publicly in the live streaming meeting.

Data Collection and Analysis
Data for the present study came from two sources: students' learning records and questionnaire responses.

Learning Records
Students' learning records in the Spring semester of 2020 were collected from the instructor's account in MuClass. Data were sorted in terms of engagement and scores (see Table 2).
According to course requirement, the recordings of the prepared speeches were submitted to iCourse before March 17, while the recordings of the online discussions were submitted right after the live streaming meeting on March 17.
Based on a rubric with detailed scales on the content, organization, and delivery of oral output, two English teachers with ten years of experience in College English teaching served as raters, separately listening to and scoring these two types of recordings.
For inconsistent scores, the two raters were invited to listen to the recording again and give a second or third score until the discrepancy between the two raters' scores was no more than 1 point. After that, an inter-rater reliability test was conducted, and the results suggested high inter-rater reliability. Eleven participants failed to log in to iCourse and MuClass with the same username in the first two weeks. As a result, complete records of their online learning were not available, and their learning records were excluded from analysis; hence, data analysis for the present study was based on 49 students. Every student was coded as S1, S2, S3, …, S49.
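The inter-rater reliability check described above can be sketched in a few lines. The scores below are hypothetical and serve only to illustrate one common computation (Pearson's r between the two raters' scores); the paper reports the procedure rather than a specific coefficient or index.

```python
from scipy import stats

# Hypothetical scores from the two raters for the same ten recordings
rater1 = [85, 78, 92, 70, 88, 75, 90, 82, 79, 86]
rater2 = [84, 79, 91, 71, 88, 76, 89, 83, 78, 85]

# Pearson's r is one common index of inter-rater reliability
# for continuous scores; high r indicates consistent rating
r, p = stats.pearsonr(rater1, rater2)
print(f"inter-rater r = {r:.3f} (p = {p:.4f})")
```

With the two raters' scores differing by at most 1 point after reconciliation, the coefficient comes out very close to 1, which is the pattern the reconciliation procedure is designed to produce.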
Learning data of these 49 students in the Fall semester of 2019 were then extracted from their learning archives. These data include their records in the online discussion forums, their scores in the in-class oral presentation, and their scores in the listening test in the final exam. Note that these students' in-class oral presentations had been evaluated on the spot by the teacher (60%) and peers (40%) based on the same rubric, with detailed scales on the content, organization, and delivery of oral output. Meanwhile, the timed listening test (35 items, 1 point each) had been conducted online in a computer lab under the supervision of two teachers.
Data were then analyzed with SPSS (25.0) to conduct descriptive analysis and paired-sample t-tests of engagement and scores, with the records from the Fall semester of 2019 serving as pretest scores and those from the Spring semester of 2020 serving as posttest scores. Since these students did not return to campus before September 2020 in an attempt to contain the spread of COVID-19, the final exam was not held before the publication of this paper. As a remedy, the total score of the two timed quizzes (14 items in total, 1 point each) in the listening live streaming meeting was counted as the posttest listening score. As the raw total scores of the two listening tests and the raw total numbers of discussion forums (pretest and posttest) were unequal, students' scores were converted to standard scores before running the paired-sample t-tests.
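The comparison above can be sketched as follows. The data are hypothetical, and the exact conversion used in SPSS is not specified in the text; rescaling each test to a common 0-100 range is one plausible reading of "standard scores" when the two tests have unequal maximums (35 vs. 14 items).

```python
import numpy as np
from scipy import stats

# Hypothetical raw listening scores for the same ten students:
# pretest out of 35 items (Fall 2019), posttest out of 14 items (Spring 2020)
pretest_raw = np.array([20, 22, 18, 25, 21, 24, 19, 23, 20, 26], dtype=float)
posttest_raw = np.array([10, 11, 9, 13, 11, 12, 10, 12, 10, 13], dtype=float)

# Rescale each test to 0-100 so the unequal maximums become comparable
pretest = pretest_raw / 35 * 100
posttest = posttest_raw / 14 * 100

# Paired-sample t-test on the rescaled scores (same students, two occasions)
t, p = stats.ttest_rel(posttest, pretest)
print(f"t = {t:.3f}, p = {p:.4f}")
```

A positive t with a small p would indicate, as in Table 7, that posttest scores were significantly higher than pretest scores for the same group of students.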

Questionnaire Responses
Data on student perceptions of the online learning experience were collected through an anonymous questionnaire survey. A questionnaire entitled "Questionnaire on Students' Perceptions of the Formative Assessment as Online Instruction Intervention" (containing 12 closed-ended questions) was designed to invite students to evaluate the benefits of the formative assessment activities in the asynchronous and synchronous online instruction. It was administered at the end of the second live streaming meeting. Students chose from 1-5 (strongly disagree, disagree, neither agree nor disagree, agree, strongly agree) to describe their perceptions in the Likert-scale questions (see Table 8).
Altogether, 54 of the 60 students completed the questionnaire. A reliability test found the Cronbach's α coefficient of the questionnaire to be .952. After that, descriptive analysis of the responses to the closed-ended questions was conducted. In addition, Pearson's correlation analysis was conducted to examine the relationships between the scores of the items.
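Cronbach's α for a Likert questionnaire can be computed directly from the respondents-by-items score matrix. The responses below are hypothetical (a small 6 × 4 matrix for illustration, not the study's 54 × 12 data); the formula is the standard one based on item variances and total-score variance.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert scores."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 students x 4 Likert items (1-5)
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.3f}")
```

A coefficient above .9, as reported for the study's questionnaire (.952), indicates very high internal consistency among the items.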

Student Engagement in Online Learning
Student engagement includes participation in both the asynchronous and the synchronous online learning activities.

Student Engagement in Tasks, Quizzes and Tests
As depicted in Figure 2, a series of tasks, tests, and quizzes was implemented in both the asynchronous and the synchronous online instruction. Table 3 describes student engagement in the tasks, tests, and quizzes arranged with iCourse and MuClass. Table 3 shows that 89.80%-97.96% of the 49 students completed the asynchronous tasks and tests on time. Five of them (10.20%) postponed the speaking module test, resulting in the lowest submission rate (89.80%) among all of the tasks and tests; two students (4.08%) did not submit the listening module test; one student (2.04%) failed to submit the prepared speech, yielding the highest submission rate (97.96%) among all of the tasks and tests.
Meanwhile, Table 3 also indicates that 91.84%-100% of the 49 students fulfilled the synchronous tasks, tests, and quizzes conducted during the two live streaming meetings. All of the students (100%) completed Quiz 2 and the online oral discussion; one student skipped the first listening quiz; three students missed filling in the evaluation form on the exemplar speech; the lowest submission rate was for the voting task (91.84%), as four students missed the voting task delivered in MuClass. Informal interviews with these students found that they missed the MuClass tasks/quizzes because they were absent from the live streaming meeting during the task/quiz time. Most importantly, these students learned from this experience that the live streaming meetings for College English II required their full attention.
The result of this part indicates that frequent online formative assessment activities can serve as a mechanism to monitor and diagnose student learning, which is challenging in spatially separated classrooms, as claimed by Cheng et al. (2013).

Student Engagement in Video Watching
As the learning contents in the open online course were released gradually, students only had access to the warming-up, listening, and speaking modules before March 22. These modules contain 31 videos (1.487 hours in total length) and 18 online discussion forums. Students were required to finish watching all of the videos and to participate in the online discussions at their own pace, path, time, and place. Table 4 depicts student engagement in video watching and the discussion forums in the open online course.
As indicated in Table 4, the 49 students watched the 31 videos 32.388 times on average, spending 2.151 hours on video watching. This result indicates that the students watched all of the videos as required and that some students watched some videos more than once, spending an extra 0.664 hours on average on video watching. This finding supports previous scholars' claim that the flexibility of online courses enables students to learn at their own pace and path (Horn & Staker, 2014), that students like to learn in asynchronous online courses at their own pace, and, most importantly, that their learning outcomes appear to be the same as in traditional face-to-face courses (Talent-Runnels et al., 2006).

Student Engagement in Online Discussion Forums
With respect to replies to the 18 threads in the discussion forums on iCourse, students contributed 9.714 replies on average, indicating that they had on average completed 53.97% of the discussion forums. Some students replied more than once to a thread; for example, S7 and S18 each contributed 21 replies. Some students replied to almost every thread; for instance, S21 contributed 17 replies, and S12 and S30 each contributed 16 replies.
A comparison of the average number of replies to threads in the discussion forums in the two semesters shows that there was no significant increase in the number of replies (see Table 5).
Such a result could be attributed to the fact that MuClass only counted students' reply posts to threads, without counting the reply-to-reply posts that recorded students' interactions with each other in the online discussion forums. In fact, the instructor in the present study had guided students to synchronously evaluate the same reply post during the live streaming meetings.
Taking one of the three evaluation tasks in the second live streaming as an example, students were directed to comment on a selected speech outline posted by their classmates (see Figure 3). Figure 3 shows that 48 students wrote text comments, while 34 students gave the outline a thumbs-up. In fact, this kind of instructional activity encourages students to comment on others' viewpoints and, above all, helps them form a habit of interacting with each other frequently in the online discussion forums. Unfortunately, data on such interactions were not available, so a whole picture of student engagement in the online discussions could not be drawn at present. However, the authors would like to claim that most of the 49 students did participate actively in the online discussion forums.
The finding in this part adds to the previous literature an instructional intervention that ensures student engagement in both asynchronous and synchronous online instruction. As reviewed by Rodrigues et al. (2017), an increasing amount of research has focused on using novel technologies to engage students. For example, Kangas et al. (2017) created online learning environments filled with digital technology and found them helpful in increasing student engagement in online learning. However, technological tools do not always work well in involving students (Boiselle, 2014: 1-14). Therefore, an alternative instructional intervention such as online formative assessment featuring frequent tasks, quizzes, and tests would shed light on the pedagogical design and practice of online instruction, especially live streaming meetings as a quick response to extreme conditions in education.
For example, Zainuddin et al. (2020) used online quizzes as formative assessment and found them effective in engaging students in online learning activities. Similarly, Gikandi et al. (2011) implemented online formative assessment by utilizing online quiz tools, online discussion forums, etc., and found them effective in improving student engagement. Table 6 describes students' learning performance in the two speaking tasks, two quizzes, and two module tests completed in both the asynchronous and the synchronous online learning.

Learning Outcomes in Tasks
Generally speaking, most of the students had completed the prepared speech (97.96%) and the online oral discussion (93.88%) as required and met the criterion of performing well in terms of content, organization, and delivery. Accordingly, they received a mean score of 82.827 for the prepared speech and a mean score of 78.643 for the online oral discussion (see Table 6; note that the minimum scores include 0, which was given to students who missed a quiz or test or were totally off the point in the speech or oral discussion). The mean score of the prepared speech was dragged down by one student (S6) who did not submit his work and hence scored 0. Meanwhile, the mean score of the oral discussion was brought down by three students who scored 0: two of them (S36 and S42) carried out the whole discussion in their mother tongue rather than in English as required, whereas one student (S29) was totally off the point.

Figure 3. Screenshot of students' synchronous online interaction in a discussion forum

Learning Outcomes in Quizzes and Tests
When it comes to students' learning outcomes as represented by the scores of the quizzes and tests, Table 6 suggests that students' maximum scores in the two quizzes and two module tests all reached 100, while their minimum scores were 0.
As for Quiz 1, only one student (S24) did not fill in the quiz. However, the mean score of Quiz 1 was relatively low (M=52.041), dragged down by five students (S12, S28, S29, S42, S49) who scored 0, ten students who scored 25, and fifteen students who scored 50. Informal interviews with some of these students found that Quiz 1 was the first real-time quiz they had encountered in their first live streaming; they spent some time shifting from the live streaming to MuClass, leaving them insufficient time to respond to the questions before the submission deadline. Quiz 2, released 30 minutes later, received a mean score of 96.531, indicating that they had become accustomed to handling real-time quizzes during live streaming. Most importantly, the high mean score suggests that they had mastered the knowledge delivered in the live streaming meeting.
With regard to the two module tests, two students (S33, S39) scored 0 in the listening module test because they did not finish it, dragging the mean score down to 95.510, while five students (S6, S8, S9, S33, S38) did not complete the speaking module test and hence scored 0, yielding a mean score of 89.184.

Comparison of Student Performance in the Two Semesters
A comparison of students' scores in the listening test and the prepared speech across the two semesters shows that students' scores in both skills were significantly better than before, implying that the use of formative assessment had a positive effect on students' learning (see Table 7).

Student Perceptions of the Online Learning Experience
According to Rodrigues et al. (2017), students' perceptions of online learning can be an important indicator of the quality of online instruction. This study collected student responses on their experience of the formative assessment conducted in both the asynchronous and the synchronous online settings. Descriptive analysis of the responses to the questionnaire is presented in Table 8, while the result of the correlation analysis is provided in Table 9. As indicated in Table 8, the results for Items 1-3 show that students had very positive perceptions of the role of the videos in the self-paced online learning course, the live streaming meetings, and the assessment activities in improving their engagement (M=4.18, 4.07, 4.04). This finding is in alignment with previous research findings that online formative assessment can be effective in improving student engagement (Cheng et al., 2013; Gikandi et al., 2011; Zainuddin et al., 2020).
When it comes to learning outcomes, the results for Items 4 and 6 show that students quite agreed that the learning guidance was clear (M=4.04, 3.75). Furthermore, the results for Items 5, 7, and 8 show that they quite agreed that the frequent tasks, tests, and quizzes were helpful in enhancing learning (M=3.91, 3.93, 3.96). This finding supports Cercone's (2008) claim that frequent exercises and assessment help improve the quality of online instruction.
As for general perceptions, students agreed that the workload of the tasks and activities was moderate (see Items 9 and 10); meanwhile, they agreed that the asynchronous and synchronous online instruction was closely connected and progressive (see Item 11). In addition, they showed positive perceptions of the whole course (see Item 12), indicating that the pedagogical design of the online instruction was well accepted.
It is noteworthy that students' perceptions of the 12 items in the questionnaire were significantly correlated with each other at p < .01. As depicted in Table 9, the inter-correlation coefficients ranged from r=.443 (between E3 and LO1) to r=.782 (between LO4 and OE2), suggesting that a pedagogical intervention aimed at improving student engagement might also promote learning and bring about a better learning experience, and vice versa.
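An item-by-item correlation matrix like Table 9 can be produced directly from the response matrix. The responses below are hypothetical (6 respondents × 3 items rather than the study's 54 × 12), and serve only to show the shape of the computation.

```python
import numpy as np

# Hypothetical responses: 6 students x 3 questionnaire items (1-5 Likert)
responses = np.array([
    [4, 5, 4],
    [3, 3, 4],
    [5, 5, 5],
    [2, 3, 2],
    [4, 4, 5],
    [3, 4, 3],
])

# Pairwise Pearson correlations between items; with rowvar=False,
# each column (item) is treated as a variable
corr = np.corrcoef(responses, rowvar=False)
print(np.round(corr, 3))
```

The diagonal of the resulting matrix is 1 (each item with itself), and the off-diagonal entries correspond to the inter-item coefficients reported in Table 9; significance tests for each pair would be run separately (e.g., with `scipy.stats.pearsonr`).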

CONCLUSION AND IMPLICATIONS
As a quick and effective response to maintain undisrupted learning during the COVID-19 crisis, the present study proposes the application of formative assessment as an intervention to engage students and facilitate their online learning. Such formative assessment features frequent tasks, quizzes, and tests implemented in both asynchronous and synchronous online instruction. Analysis and comparison of students' formative assessment records in the Fall semester of 2019 and the Spring semester of 2020 indicated that this pedagogical intervention was helpful in engaging students in all of the online learning activities; in addition, it was effective in enhancing learning outcomes, as represented by the higher scores in the tasks, quizzes, and tests. Furthermore, students acknowledged the benefits of the formative assessment in improving their engagement and facilitating their online learning, and held positive perceptions of the formative assessment activities in the course that was converted to fully online delivery during the school closure.
The findings of this study generate a number of important implications for teachers involved in a sudden transition to online instruction and for teachers who are interested in providing a better online or blended learning experience in the future:
1. Design frequent formative assessment activities.
The present study designed 4-8 formative assessment activities in the 80-minute live streaming meetings and found them effective in engaging college students and facilitating learning. Based on this finding, K-12 teachers are advised to plan no fewer than four formative assessment activities within a 40-minute live streaming meeting, as younger students are more easily distracted.
2. Design progressive formative assessment activities within a learning unit/module.
The present study designed cohesive and progressive formative assessment activities in every learning module: the less demanding activities in the self-paced online learning course developed lower-order cognitive skills (i.e., remember, understand), while the more challenging activities in the live streaming meetings cultivated higher-order cognitive skills, namely apply, analyze, evaluate, and create (Anderson et al., 2001). The questionnaire result for Item 11 shows that students highly appreciated such pedagogical design (M = 4.05). Therefore, it is suggested that teachers design and assign frequent tasks, quizzes, and tests that challenge students with increasing difficulty.
3. Predict student performance and prepare teacher feedback in advance.
Teachers might find it challenging to provide prompt feedback or make instant adjustments during live streaming meetings. Therefore, it is suggested that teachers predict student performance on the designed formative assessment activities, prepare teacher feedback, and plan different versions of instant adjustment in advance. For example, in the first live streaming meeting, the first author had planned to replay the recording and invite students to resubmit their answers if 40% of the students failed the first listening quiz on the first listening, and to give tips if more than 30% of the students still failed the quiz after the second listening.

FUTURE RESEARCH DIRECTIONS
The present study is preliminary research on applying formative assessment featuring frequent tasks, quizzes, and tests to engage students and improve learning outcomes. Unfortunately, controlled experiments were not feasible at the time. Future research is suggested to compare the effects of such pedagogical intervention in traditional distance learning or face-to-face settings.
Another limitation of the present study lies in the fact that students' reply-to-reply posts were not counted automatically by the learning platform at the time of the study. As a result, the full effect of the designed formative assessment on student engagement could not be observed.
In addition, a natural progression of this work is to analyze how teachers adjust their teaching and how students adjust their learning tactics based on the constant feedback obtained from the frequent formative assessment activities.

ACKNOWLEDGMENT
We would like to thank all the participants for making this research possible. Our heartfelt thanks go to the editors and anonymous reviewers for their invaluable feedback on earlier versions of this paper. This work was supported by the Planned Projects of the 13th Five-Year Plan for Educational Science in Guangdong Province, China (Grant No. 2018GXJK013).

STATEMENTS ON OPEN DATA, ETHICS AND CONFLICT OF INTEREST
The privacy of the participants was protected. On the one hand, each student was assigned a code when referred to in the article; on the other hand, students did not need to fill in their names on the questionnaires, which guaranteed anonymity. Findings from the recordings of the oral tasks were discussed only within the research team (the authors of the paper and the two raters). No conflict of interest is anticipated to arise from the research.