Another Lesson Learned – Probing Beyond Raw Test Scores

A revelation came at a school where it appeared that very few students understood science. In probing beyond raw scores, we again found something that forced us to drastically change our approach to analyzing test scores. At this school, we helped the leadership team probe patterns of responses and samples of students' written work. We found that for almost 60 percent of the students who appeared less than proficient, some variable other than content knowledge was the major issue.

Simple disaggregation of the data did not adequately explain why student performance was low. However, when we looked closely at each student's performance on the test and compared it with the teacher's estimate of how that student would perform, we found some interesting patterns. In the following chart, we included three sample students from this examination of 120 different student response patterns. In all three cases, the multiple-choice responses indicated a low level of content knowledge about science.

[Chart not reproduced: test scores for three sample students shown alongside their teachers' expected performance.]

Student 1 got only six questions correct. His science teacher noted that he was not very involved in the education process, so this was an excellent score for him. Student 2, in contrast, scored lower than the teacher expected. He was a good student, motivated to be perfect when tested, but he was also very insecure and test-anxious. He was bothered by distractions and had a hard time getting back on task after interruptions. Student 3's marks were a shock to everyone; he was a straight-A student who planned to go to medical school. He had earned the highest science grade in the school for three years running and had completed more science credits than any other student in the school, yet he failed to exhibit a high level of content knowledge.

These students — different in socioeconomic status, program placement, and academic performance — all scored in the same range. The school, true to its remediation plan, placed all three in science remediation. After all, didn’t the data indicate all of them needed it? (Some of us did question why students had been placed in science remediation when those students would not be tested in science in the current year.)

As we analyzed the work of Student 1, it became obvious he had used the "Christmas tree" pattern to fill in his answers: A, B, C, A, B, C, repeating down the answer sheet. His score was a false read because his performance was unrelated to his actual knowledge.
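As an aside, this kind of mechanical bubbling is easy to screen for once you go looking. The sketch below is our illustration, not anything the school actually ran; the answer strings, cycle lengths, and repeat threshold are all hypothetical.

```python
def looks_like_cycle(answers, max_cycle=4, min_repeats=3):
    """Flag an answer sequence that repeats a short cycle
    (e.g., A, B, C, A, B, C), which hints the student filled in
    bubbles mechanically rather than working the questions."""
    for cycle in range(1, max_cycle + 1):
        pattern = answers[:cycle]
        # Tile the candidate cycle to the full sheet length and compare.
        repeated = (pattern * (len(answers) // cycle + 1))[:len(answers)]
        if answers == repeated and len(answers) // cycle >= min_repeats:
            return True
    return False

# Hypothetical answer sheets: one patterned, one not.
print(looks_like_cycle(list("ABCABCABCABC")))  # True  -> flag for a closer look
print(looks_like_cycle(list("ACDBBADCCABD")))  # False -> no pattern detected
```

A flag like this does not prove the student knows nothing; it only says the score cannot be trusted as a read of content knowledge, which is exactly the point.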

After checking the student's records, teachers discovered he had passed no classes in his high school career and regularly gave no effort during lessons. He had missed 22 days of school and been tardy to science class 11 times. In this case, we had no idea whether or not the student knew any science, but we were sure his attitude was a problem. We were also sure science remediation was not the intervention he needed most. Troublingly, when asked about current students who had similar profiles, the staff had no idea; students had never been grouped by anything beyond raw scores. A review of their records showed many other low-performing students shared this profile.

In the case of the second student whose data we probed, when asked what had happened, he said he was disappointed: he had spent a lot of time on one question, run out of time, and guessed on the rest. The teacher remembered that several disturbances had occurred during the test, and testing obviously made this student anxious. Again, we had uncovered a false read of student knowledge and performance level, and again, we determined science remediation would not solve the problem. When we asked about other students in this year's science class with similar test anxiety and attention issues, again, the school did not know.

We had no idea what had happened to Student 3 until someone figured out that he knew the correct answers but had marked them out of sequence on his answer sheet. When he was shown his work and questioned about his performance, he was surprised but didn't seem disappointed. When asked why he hadn't caught his mistake, he said, "Nobody told me to check my work." Here, a cavalier attitude about the test, inadequate testing experience, and poor preparation in test strategy seemed the major issues. Science remediation was, yet again, an inappropriate intervention for this student; he probably could have taught the class. His score, like those of the other two students, was a false read of his knowledge and performance level. When asked how many other gifted students had taken little interest in the test, or how many had poor test-taking strategies and had scored below their potential, the school had no idea.

Because of this deeper inquiry, we found several patterns that could better inform planning. Attitude toward the test, time management, facility with the test format, and testing confidence all emerged as school-wide issues that were every bit as problematic as the curriculum itself or how much science the students knew or didn't know. For the school, this aha moment led to a new and very different planning cycle.

It required a great deal of soul searching and provoked real pushback, but we came to understand that we couldn't keep using test scores the way we had been. Gross scores, or simple disaggregation by race or socioeconomic status, did not give us the data we needed. We did discover that there were things we could learn from analyzing test reports. One of the most important was how to analyze the scores to identify red-flag issues. By looking closely at reports to detect patterns, scores, or cells that were either well above or well below our expectations, we could pinpoint areas needing more evaluation and probing. The test score was a starting point in our exploration, but we had to figure out what the scores meant before we could identify the root problem and build intentional plans to address it. With limited time, we cannot afford to waste any of it on the wrong problems. Fixing something quickly is no substitute for fixing the right thing well.
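To make the red-flag idea concrete, here is a minimal sketch of the expectation-versus-score screening described above; the names, numbers, and 20-point threshold are hypothetical, not the school's actual data or tooling.

```python
# Hypothetical records: (student, teacher-estimated score, actual score).
records = [
    ("Student 1", 10, 6),    # teacher expected little; low score is no surprise
    ("Student 2", 85, 34),   # test anxiety; score far below the estimate
    ("Student 3", 95, 31),   # mis-bubbled answers; score far below the estimate
]

THRESHOLD = 20  # points of deviation that warrant a closer look (assumed)

for name, expected, actual in records:
    gap = actual - expected
    if abs(gap) >= THRESHOLD:
        print(f"RED FLAG {name}: scored {actual}, expected {expected} "
              f"(gap {gap:+d}) -- probe before prescribing remediation")
```

Note that the trigger is deviation from expectation, not the raw score itself: Student 1's low score raises no flag because it matches what his teacher predicted, while Students 2 and 3 are flagged precisely because their scores contradict everything else known about them.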

Is it time to take a more intensive look at your test results? Need a partner to help you decipher the data and come up with an actionable plan to improve test results? We can help with academic management, teacher professional development, and more. Contact us via our website today.

This and other topics are covered in our book, Turning Around Turnaround Schools. It is intended for those who work in or with turnaround schools, though the approach to education it presents works with any type of school (public, private, charter, and so on) at any level (struggling, passing, blue ribbon, and so on). We hope to share the lessons we have learned and to describe strategies and processes that have proven successful for us. Most importantly, we hope to provide you with a set of understandings and tools that will guide your work and make you a more intentional, effective agent of change. Find it online at: https://www.amazon.com/Turning-Around-Turnaround-Schools-Conventional/dp/1948238020/ref=sr_1_1?ie=UTF8&qid=1537974543&sr=8-1&keywords=turning+around+turnaround+schools
