Could Rubric-Based Grading Be the Assessment of the Future?

Institutions of higher education are under pressure from students and employers to prove that graduates are gaining the cross-cutting skills -- such as critical thinking, problem-solving, communication and quantitative analysis -- necessary for success in the real world. Now, a consortium of 59 universities and community colleges in nine states is working to develop a rubric-based assessment system that would allow them to measure these crucial skills within ongoing coursework that students produce.

Right now, some universities require a small sample of their students to take a standardized test before graduating, but many administrators and faculty find this method problematic. Students have no personal investment in the test, and it is divorced from the coursework they see as their primary objective while in college. And students who don't test well may still shine in their coursework.

Concerns over the effectiveness of standardized tests prompted the Association of American Colleges and Universities to begin working on a rubric-based alternative that is consistent and valid.

First, they set out to define the essential learning outcomes that faculty, employers and accreditors saw as important. They settled on 16 qualities, including critical thinking, writing, quantitative literacy, oral communication, ethics, teamwork, intercultural understanding, and integrating learning from one area to another.

For the first-year pilot study, they focused on three of those outcomes: written communication, critical thinking and quantitative literacy. Faculty worked together to write rubrics (called Valid Assessment of Learning in Undergraduate Education, or VALUE, rubrics) that laid out what a progression of these skills looks like. The rubrics were tested on campuses and rewritten three times before reaching a final version.

Once the rubrics were set, faculty from all 59 institutions were trained on how to use them. They went through norming sessions in which each person scored a piece of student work using the rubric, then came together to make sure everyone was scoring consistently.

“I really trust the process of preparation that the scorers have,” said Kathy Wills, associate English professor and director of the writing program at Indiana University-Purdue University Columbus campus. “Having gone through the norming session, I see how it works and that you can really get to a fair assessment of student work.”

In a pilot study of the rubrics, 127 trained scorers evaluated 7,000 samples of student work across a variety of disciplines. Because they were grading the cross-cutting skills of written communication, critical thinking and quantitative literacy, faculty evaluated work from disciplines that were not their own.

“These rubrics are designed to be cross-disciplinary,” explained Bonnie Orcutt, associate professor of economics at Worcester State University, who is serving temporarily as director of Learning Outcomes Assessment for the Massachusetts Department of Higher Education. “I can look at something and have no idea if the content was correct, but that’s not what I’m looking for. Independent of whether the content is correct, they may have used a body of evidence really well, have good organization, good syntax, good citations.”

In other words, the facts might be all wrong, but the student may still be a strong writer, and that is what the scorer is evaluating with this rubric.

The pilot study results are encouraging. “This is something that’s possible for campuses to do and is something that could be delivered to states for decision-making,” said Terrel Rhodes, vice president of the Association of American Colleges and Universities.

The organization is now working with assessment experts to design an even more rigorous study of rubric-based grading, but Rhodes says the pilot study results have faculty and administrators feeling confident.

ASSESSMENT THAT IMPROVES TEACHING

An unexpected consequence of the study has been the examination of teaching practices in higher education. “Faculty get invested in what they’re getting out of it,” Rhodes said. “They’re actually getting information that they’re turning around and using to change the way they’re teaching and the assignments they are using.”

That never happened with the standardized tests, which spit out numbers but don’t offer the formative feedback that professors need to target specific areas for improvement.

Professors began realizing how much the language of their assignment prompts communicated about what they expected from students. That might seem obvious, but without other samples to compare against, professors often assumed their students simply lacked the skills.

“When you see the differences in the kind of responses you get from students from one prompt to another, that made me even more aware,” Wills said. She is now extra careful to write clear prompts and has workshopped assignments with her colleagues in the English department to make sure students understand what she wants from them. She’s also trying new kinds of assignments.

For example, in a creative writing poetry class, she had students bring in objects from nature or identify smells before they wrote a single word. She wanted them to have a visceral understanding of sensory experience and bring that to their writing.

“It helped them understand the concept,” she said. Later, when she critiqued their writing for being too abstract and told students they needed more sensory imagery, they knew what she meant. “I was able to pull on a base that I had created. It made my teaching easier in some ways.”

The pilot has also gotten the faculty thinking more about how they are going to incorporate cross-disciplinary skills throughout their content areas. “You can’t teach writing or critical thinking without some content, so you’re thinking how can I use this content as the vehicle to enhance students’ critical thinking, quantitative thinking skills, etc.,” said Orcutt.

She believes the rubric-based assessment project has been profoundly helpful, not just for identifying how well universities are teaching these skills, but also for shining a light on what’s happening in the classroom.

BRIDGING THE GAP TO K-12

So far, this rubric work has been happening only at two- and four-year institutions. But the conversation in higher education isn’t so different from the one going on in K-12 schools. Parents and teachers are pushing back against blunt assessment instruments like standardized tests and are looking for a way to hold schools accountable that doesn’t take time away from class work.

“The important thing here is it’s authentic work and not just teaching to the test,” Wills said of the pilot study. “It’s real teachers and real work.”

Many K-12 educators and parents would like to see a similar system in their schools. They welcome assessment and see the need to make sure kids are learning, but they’d like those evaluations to be based on the work students produce for class, in a context they care about.

Wills acknowledges that a lot of the new focus on cross-cutting skills in college comes out of work going on in K-12 education. She’s been working closely with other educators in Indiana to communicate university expectations to K-12 schools and help align curriculum.

Rhodes says that while K-12 teachers weren’t explicitly involved in constructing the rubrics, they were brought in later as part of a group testing the ease of use.

“All of them were able to utilize the rubrics, look at student work and come to high levels of agreement about the quality of the work,” Rhodes said.

K-12 teachers are also looking at the rubrics on the organization's website, although Rhodes doesn't know for sure how they are being used. He sees this as a positive sign that, down the road, a similar rubric-based assessment system could be used to measure these types of skills at all levels.
