Not-so shocking: “Students’ writing scores lower on computer tests”
I was on Twitter yesterday when I noticed a news article making the rounds, its headline screaming “Students’ writing scores lower on computer tests.”
Here is an excerpt from the opening paragraph:
> How can this be? How can a generation who has grown up with computers, interacted with them, taken keyboarding and computing classes, how can THIS generation score lower on a computer-based writing assessment when this is their domain? their kingdom? Shouldn’t they have scored higher? The prompts were the same, the time constraints were the same, only the media was different. The experts in the district were puzzled and surprised.
What is surprising is not the results but the explanations: the kids are used to spell check and grammar check (which were not available to them even on the computer-based assessment), they were slower when they typed, they revised more often on the computer. The possibility no one raised was that the explanation lay in the way the essays were scored.
I know of at least two studies (see below) which have pointed out that typed essays are consistently scored lower than handwritten ones. Take the very same essay, give the handwritten version to a trained rater to read, then give that rater the typed version, and they will score the typed version lower.
It seems to me that when we are considering different types of assessments for our students, we should be sure of what we are actually assessing. In this case, the assessment told us that the raters prefer handwritten essays.
- Students’ writing scores lower on computer tests
- Computers and Assessment: The Effect of Typing versus Handwriting on the Holistic Scoring of Essays. Sweedler, Carol. (1991)
- Comparability of Scores on Word-Processed and Handwritten Essays on the Graduate Management Admissions Test. Breland, Hunter, Brent Bridgeman, and Mary Fowles. (1999)