As a high school history teacher, I’m committed to helping my students improve their writing. But, like many of us, I struggle to find time to provide meaningful, timely feedback and assessment. A few years ago, I discovered automated writing feedback programs, and everything changed. Now, students are not only more engaged in class writing projects, but are working more collaboratively. My role has also changed. No longer a grammar or rubric enforcer, I can now work with my students as an ally to make sense of both peer and computer-generated feedback.
Automated writing feedback programs are web-based programs that enable me to double the amount of writing I assign in my classes while placing the job of evaluation back on my students. The number of free and student-friendly automated writing feedback programs has multiplied over the years, and many focus on specific elements of the writing process.
- Grammarly claims to find and correct ten times more mistakes than a word processor.
- The Hemingway App makes writing bold and clear by focusing attention on the number of adverbs, use of the passive voice, and readability of sentences.
- PaperRater offers feedback by comparing a writer’s work to others at their grade level. It ranks each paper on a percentile scale examining originality, grammar, spelling, phrasing, transitions, academic vocabulary, voice, and style. Then it provides students with an overall grade.
For the past few years, I’ve experimented with having my students hand in final drafts of their writing after they have: (1) run drafts through each of these three automated writing feedback programs and (2) run improved drafts past their peers. My students go through this process to improve each draft of their writing before I ever look at it.
Automated writing feedback programs have changed the game. I no longer have to do the dirty work of correcting spelling or grammatical mistakes. Now students seek me out to help them address the feedback from the computer. I have found that students in my class will revise their writing per the advice of PaperRater in order to increase their final score. As a teacher, I am secretly thrilled when a student tells me, “My first PaperRater score was a 78, but I went back to the textbook and added 8 more vocabulary words and my score went up to 86.” While this may sound like “gaming the system,” the trial-and-error process increases the amount of practice that students get and improves their writing proficiency. Studies have shown that students working with writing feedback tools write more than three times as much as their peers who aren’t using these tools. This increased effort during revision helps students see writing as an iterative process instead of a one-and-done assignment.
After giving my students a period of time to run drafts through one or more digital writing feedback programs, I ask them to write a Revision Memo. Modeled after the work of Bardine and Fulton (2008), revision memos provide students with a chance to reflect on the significant changes they have made throughout the writing process. Then they examine how effective their changes were in improving the quality of their writing, prior to having a peer validate or question their choices. View several examples of my students’ revision memos here.
After completing their revision memos, students run their now revised drafts past their peers. Research suggests this should be carefully structured, with a learning design that includes phases of activity, peer assessment, reviewing, and reflecting. Author Susan Brookhart recommends student-generated rubrics as the basis for highly effective peer grading systems. I include criteria charts containing the historical details I expect to see in student writing, in order to guide students in identifying important details. I also use free online polling tools like SurveyMonkey, Poll Everywhere, and Google Forms to create peer review protocols.
This focuses students on a significant feature of the writing assignment, whether it was creating a compelling opening to a speech, or including academic vocabulary. This also transforms what used to be a lonely, individual grading and feedback process into a student-centered, collaborative activity.
Peer review is sometimes called “learning by evaluation.” It significantly improves a student’s self-assessment abilities and lays the groundwork for self-improvement. Peer review protocols should focus on only one or two aspects of effective writing, include student discussion to drive reflection, and allow increased instructional time for student revision. Single-point rubrics help students describe exactly what a writer needs to focus on. I have varied the scope and sequence of revision memos and peer review, and found that the amount of class time needed for these activities decreases as students become more proficient with the process.
Many teachers, especially those of us in low-income public schools, struggle to increase the amount of writing assigned and to provide effective feedback that motivates students to revise their work. We need more cooperation and collaboration between the disciplines to find an appropriate balance between computerized and human feedback for our students. Turnitin has conducted extensive research on how feedback impacts the development of student writing. Chief among student complaints about teacher feedback is the length of time teachers take to grade and return papers. Automated writing feedback tools and peer review activities can help teachers reduce this problem. Too many teachers focus solely on the plagiarism detection features in automated writing tools; I believe they are missing an opportunity to extend critical thinking on meaningful writing tasks. Imagine the impact on student achievement if all teachers leveraged technology to improve discipline-specific writing.