Breakthroughs in Pedagogy: Analyzing student data to make instruction relevant
Dawn Pipkin is an instructional resource teacher and National Board Certified Teacher at Leonardtown Middle School in St. Mary’s County. She represents MSEA on MSDE’s Professional Standards in Teacher Education Board and serves as its vice-chair.
Educators harness the power of technology every day to facilitate quality instruction and assess student learning. We must now not only hone our skills in understanding the standards and planning engaging lessons, but also master a new skill: data analysis.
In my role as a school-based instructional resource teacher, I work with classroom teachers daily to help them analyze the classroom, local, and state assessments that inform their instructional decisions. Assessment literacy is not always included in undergraduate programs, but using data to inform and monitor instruction is an expectation of educators across Maryland.
When we think about assessment literacy, we should consider several factors. Looking at the way a particular class answered a question can reveal common misconceptions students have about the text they have read, the concept being taught, or the skill being assessed.
For example, if a large majority of a class picked answer B, and choice B reflects a common mathematical error that students often make, then that mistake would be a great place for their teacher to start the next day's lesson.
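For teachers who keep responses in a spreadsheet or gradebook export, a quick tally can surface this kind of pattern. The sketch below is only illustrative: the question, the keyed answer, and the `responses` list are made-up assumptions, not real student data.

```python
from collections import Counter

# Hypothetical class responses to one multiple-choice item; "C" is the keyed answer.
responses = ["B", "B", "C", "B", "A", "B", "B", "C", "B", "D",
             "B", "B", "C", "B", "B", "A", "B", "B", "C", "B"]
correct_answer = "C"

counts = Counter(responses)
total = len(responses)

print("Response distribution:")
for choice in sorted(counts):
    share = counts[choice] / total
    flag = " (correct)" if choice == correct_answer else ""
    print(f"  {choice}: {counts[choice]:2d}  ({share:.0%}){flag}")

# The most frequently chosen wrong answer points to the misconception
# worth addressing first in the next lesson.
wrong = {c: n for c, n in counts.items() if c != correct_answer}
if wrong:
    top_wrong, n = max(wrong.items(), key=lambda kv: kv[1])
    print(f"\nMost common wrong answer: {top_wrong} "
          f"({n / total:.0%} of the class), a likely starting point for tomorrow.")
```

Even a tally this simple makes the conversation concrete: instead of "many students struggled," the teacher can say exactly which wrong answer dominated and plan the opening of the next lesson around it.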
I recommend watching the video “My Favorite No: Learning from Mistakes” from the Teaching Channel to see how the teacher turns student misconceptions into a powerful lesson that engages every student. By using student work as the basis for the lesson, the class gets both positive reinforcement and another opportunity to correct misunderstandings.
Another facet of assessment literacy comes into play when a large portion of students miss the same question. In that case, there could be a problem with the way the question is framed that leads students to the wrong answer. We must always consider not only what wrong answers show us about student learning, but also what they show us about our own work in test development. Writing quality questions is a skill, and examining our own thinking alongside student responses will yield better assessments.
With selected-response or multiple-choice items, the length of the answer choices, the plausibility of the distractors, and avoiding choices so far outside the realm of possibility that they are obviously wrong all contribute to quality item development. When test items are poorly constructed, correct answers are not necessarily evidence of quality learning; faulty item writing can lead students to the right answer simply because some of the choices were easy to eliminate.
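A related check, sometimes called distractor analysis, follows the same logic as the tally above: if a wrong choice is picked by almost no one, it may be implausible and effectively shrinking the item to fewer real options. The cutoff and data below are purely illustrative assumptions, not a formal psychometric standard.

```python
from collections import Counter

# Hypothetical responses to one item; "C" is the keyed (correct) answer.
responses = ["C", "C", "B", "C", "C", "D", "C", "C", "B", "C",
             "C", "C", "C", "B", "C", "C", "C", "C", "D", "C"]
choices = ["A", "B", "C", "D"]
correct_answer = "C"
total = len(responses)
counts = Counter(responses)

# Flag distractors chosen by fewer than ~5% of students (an assumed cutoff);
# such choices may be doing no work and are worth revising.
for choice in choices:
    if choice == correct_answer:
        continue
    share = counts.get(choice, 0) / total
    status = "rarely chosen; consider revising" if share < 0.05 else "functioning"
    print(f"Distractor {choice}: {share:.0%} of responses ({status})")
```

The point is not the script itself but the habit it represents: looking at how each answer choice performed tells us as much about our item writing as it does about our students.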
Lastly, as teachers we are all focused on advancing student achievement and closing achievement gaps. Analyzing assessment data is one of the strongest tools we have for creating instruction targeted to student needs. In a time of complex standards that focus on application, it is important that we spend our teaching time on areas where students have demonstrated needs.
Too often we feel pressed to “cover the curriculum” and move on to the next lesson before we know our students have mastered essential learnings. Analyzing student data before planning can make our instruction more relevant to student needs.
If we engage in teaching lessons without thoughtful analysis of data, we could be missing valuable opportunities to correct misconceptions and build a stronger educational foundation for all of our students.