Exploring the integration of automated feedback in EFL

Shu Huang and Willy A Renandya, posted on 6 July 2022

Click on the title to download the article.

Abstract

This article reports on an exploratory study that investigated the impact of automated feedback in a writing class for lower-proficiency Chinese EFL university learners. In particular, the study explored the students’ perceptions of Pigai, the largest and most popular locally designed automated writing evaluation (AWE) system in China, and the impact of integrating this AWE tool on the revision quality of student texts. The participants were 67 students enrolled in two classes of a College English course at a Chinese university; one class was randomly selected as the experimental group (N = 35) and the other as the control group (N = 32). The data comprised student texts from one academic writing assignment (the pre-test), revised drafts of another writing assignment (the post-test), and responses to a questionnaire. Quantitative and qualitative analyses of the questionnaire show that the lower-proficiency participants generally thought highly of the feedback provided by Pigai. Statistical analyses of the student texts, however, reveal that the integration of automated feedback did not necessarily lead to observable improvement in the students’ revised drafts. Implications for writing instruction and the use of AWE technology in EFL writing classes are also discussed.

More free resources below

Is L2 Written Corrective Feedback Effective?

Feedback in L2 Writing Classes

Feedback that works
