Spencer grant to improve assessment of students’ writing skills
Over the past three years, Joshua Wilson, assistant professor in the University of Delaware’s School of Education (SOE), has studied how computer software could be used to evaluate and improve student writing, and his research continues to garner national attention. This fall, Wilson received a two-year research grant from the Spencer Foundation in support of his work to develop new writing evaluation algorithms for automated essay-scoring systems (AES) designed to improve writing skills of K-12 students.
The Spencer Foundation investigates ways in which education, broadly conceived, can be improved around the world. It encourages high-quality investigation of education through research programs and supports the educational research community through fellowships, training programs, and related activities.
While AES systems already exist, Wilson’s research is unique in its aim to deliver formative evaluations — assessments that reveal underlying gaps in student knowledge and areas for improvement rather than only indicating the success or failure of a student’s writing.
In a study of 480 students in grades 6 and 8, Wilson and colleagues successfully defined a formative evaluation model using automated measures of students’ writing. Further, there was initial evidence that students’ scores on these levels-of-language measures predicted their performance on a state writing assessment.
“A teacher assigns a writing project, which the students complete using the AES software,” Wilson said. “Students submit their drafts to the program for instant scoring. Formative AES programs not only provide scores, they also identify areas of improvement, such as lower-level concerns like spelling and grammar, and higher-level concerns like organization and cohesion. Students correct the errors to get a higher score, while learning from their mistakes. Teachers can complement the AES feedback by evaluating additional higher-level language requirements like idea development, elaboration, and style.”
Teachers use the results of an automated formative assessment to provide meaningful feedback and strategies for improvement, allowing their students to grow as writers. These automated evaluations help teachers respond to student work more quickly and could reduce the influence of implicit bias in scoring.
Expanding research on AES systems
The Spencer grant will allow Wilson to expand previous research on a new formative assessment model that identifies student language skills at the word, sentence, and discourse level through automated evaluations of the essay’s word choice, syntax, and cohesion.
Wilson will pilot this model in grades 3-5, examine results across multiple genres of student writing, and further test the ability of the model to predict scores on state writing assessments.
“Formative writing assessments are almost universally recommended,” Wilson said. “However, educators use them infrequently because none are simultaneously efficient, reliable, and informative. AES systems excel in the first two of these categories — they’re extremely efficient and reliable — but there is room for improvement with respect to how informative they are for teachers. This is where I’m focusing my research.”
In addition, Wilson has studied the effect of current systems on the teaching and learning of elementary and middle school students, using the PEG Writing assessment program. His preliminary research has shown that the quality of students’ writing improves in response to the software’s feedback, and the software appears to be equally effective for students with different levels of reading and writing skill.
This summer, Wilson also received a $399,999 grant from the Institute of Education Sciences (IES) in collaboration with Charles MacArthur, professor in the SOE, and Gaysha Beard, supervisor of English Language Arts at Red Clay School District, to study the effect of existing AES software on the writing of 2,880 Red Clay students in grades three through five.