
Moving Beyond Multiple-Choice: Using Digital Assessments to Track Learning


Photo by US Department of Education via Flickr, CC BY 2.0

It’s time for the first assessment of the new school year, and I’m ready to challenge my chemistry students with a new and improved one: “Trends in the Periodic Table of Elements.” Not only is this test completely digital, it’s also free of multiple-choice questions. Instead of A, B, C or D choices, students will demonstrate their content knowledge through short numeric or drawing responses.

Not long after my students began the assessment, a hand went up. There was confusion about a question. After inspecting the question, which was about nuclear decay and half-lives, I realized I had made a mistake in the numbers I provided.
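The actual numbers from my question aren't shown here (they were the source of the mistake), but the kind of half-life calculation this question asked for boils down to one formula. A minimal sketch, with made-up example values:

```python
def remaining_mass(initial_mass, half_life, elapsed_time):
    """Mass of a radioactive sample left after elapsed_time.

    Each half-life cuts the quantity in half:
        N = N0 * (1/2) ** (elapsed_time / half_life)
    half_life and elapsed_time just need to use the same units.
    """
    return initial_mass * 0.5 ** (elapsed_time / half_life)

# Example: an 80 g sample with a 10-year half-life, after 30 years
# (three half-lives): 80 -> 40 -> 20 -> 10
print(remaining_mass(80, 10, 30))  # 10.0
```

If the numbers in the question don't line up with this formula (say, an elapsed time that isn't a clean multiple of the half-life when whole-number answers are expected), students get stuck exactly the way mine did.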

Luckily, Goformative, the web-based assessment program I use, allowed me to make changes even after test-taking had started. I made the corrections immediately and asked students to refresh the pages on their tablets.

Every time I give a formal assessment, I suspect a few students are opening other tabs to look up information that might help them answer a question, which is obviously not allowed. During this assessment I couldn't catch anyone in the act, and I didn't have the capability to see all of my students' screens from my computer. But it reminded me that I need an effective way to keep students from opening additional browser windows or other applications while taking a test. Earlier this summer I experimented with a possible solution, Guided Access (an iPad setting), and I look forward to trying it on a future assessment.

During this first assessment, I opened a feature on my computer called Live Results, which allowed me to monitor students' progress from start to finish. I could also look at each question individually and, on one screen, see all of my students' responses to it, framed within text boxes.


Since this was a multiple choice–free test, part of it would have to be graded manually. But the questions that required numeric responses could be graded automatically once I added an answer key. There was a minor hiccup, though.

As I scanned through the numeric responses, I noticed that some students were adding words after the number, such as "grams," and the computer was marking those answers as incorrect even when the number itself was correct.

While I could manually override the computer and award the points anyway, there was a better solution: adding more than one accepted answer to the answer key. This also came in handy when a student slightly misspelled an element's name or typed "Ne" (the chemical symbol for neon) instead of the full name, "Neon." All of these adjustments were made while students were working on the test and were reflected in real time.
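Goformative handles this through extra answer-key entries; the function below is just a sketch of the underlying idea, not the program's actual implementation. It checks a short response against a set of accepted answers, ignoring case and whitespace, and also tries the response with a trailing unit word stripped off:

```python
import re

def is_correct(response, accepted):
    """Check a short response against a set of accepted answers.

    Normalizes case and whitespace, and also tries the response
    with a trailing unit word (e.g. "20 grams" -> "20") removed.
    """
    cleaned = response.strip().lower()
    accepted = {a.strip().lower() for a in accepted}
    if cleaned in accepted:
        return True
    # Drop a single trailing word after a number, e.g. "20 grams" -> "20"
    stripped = re.sub(r"^([\d.]+)\s*[a-z]+$", r"\1", cleaned)
    return stripped in accepted

print(is_correct("20 grams", {"20"}))       # True
print(is_correct("Ne", {"neon", "ne"}))     # True
print(is_correct("neon ", {"neon", "ne"}))  # True
```

Listing every variant in the key ("Neon," "neon," "Ne") does the same job without any code, which is exactly what I ended up doing mid-test.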

Next, I looked at the drawing questions, which asked students to draw electron dot diagrams, and I went through the short-response questions about periodic table trends. There was seemingly a lot to grade, but two things helped me finish all the grading for this assessment in less than a day.

First, Goformative collected students’ answers in a well-organized way, which really sped up the grading process. For example, I was able to grade all of the responses from a given question on the same screen, without the screen jumping to another page or going through a slow process to update after every change I made. A quick look at a student’s answer, a few clicks, and I was moving from student to student in a seamless and efficient manner.

Second, I got a head start on grading. Even before students finished, I was grading the completed parts of the assessment, working from my iPad so I could stay mobile and still supervise my students' test taking. This head start also meant quicker feedback for students: after I finished grading the test, I selected an option that let students log in and check their results, and I released their scores that evening.

The fact that I was able to make on-the-go adjustments to the test minimized some of the mistakes I’d made while creating it. Having student data effectively organized and ready for me to grade in an efficient manner greatly reduced my fear of giving multiple choice–free assessments. Next time I look forward to not only increasing the level of rigor in my assessments, but also making them more secure.
