

Is Knowing The Test Answer Enough? Another Lesson Learned From Years Of Academic Partnerships

During the early years of our company, we worked with several middle schools. Their data indicated social studies students were consistently scoring low on the US Constitution portion of the test. All of these schools had labeled this an area of priority need, and they had included in their plans several curriculum and professional development (PD) initiatives to address that specific problem. A few schools even decided to spend considerable money on a more engaging and interactive program.

A year passed while the new curricula and methods were implemented. However, when the data came back from that year's test, there was barely any improvement in student performance, even though each school reported that the units on the Constitution had been tremendously successful. Students were more engaged, and teachers enjoyed the lively lessons and support the program provided. So why hadn't the scores improved? The students had obviously been taught well, and their classwork showed they had learned and mastered the content.

Deeper probing of the data raised a few red flags. Looking at results from previous years, we noted that students usually scored high on the multiple-choice parts of the test, high enough to infer they knew a great deal of the content being tested. However, they scored very low on the constructed responses, the written portions of the test.

It just so happened that each year the test included one long, scaffolded constructed-response question on the Constitution. We interviewed students and discovered many would blow off this question, answering only part of it or skipping it entirely. The students gave several reasons, but mostly they had never seen that type of question before and didn't want to invest the time and energy in a written response.

This was a transformative moment in the evolution of our approach.

The schools accurately identified an area of concern. They took the commonsense step of concluding that they needed to do a better job of teaching a weak area; then they developed good plans to correct the teaching. We had no doubt the content of the curriculum improved and that teachers delivered it more effectively; however, they had already been teaching the content well, as evidenced by the relatively high multiple-choice scores. The real cause of the low subdomain score was performance on one type of question, the constructed response, and in probing the students we discovered that a large number of them struggled with those questions in all content areas. In many cases, the struggle stemmed from the fact that they had never been asked to respond to complex questions as part of their regular schoolwork. They had no experience answering questions that required three, four, or five thinking steps, and none had experience writing responses that demonstrated that thinking to a reader.

The schools initially framed the problem in terms of assumptions. School leaders assumed lower test scores were the result of inadequate content knowledge. They developed plans to fix that problem, but the plans did not work. When school leadership probed deeper for more specific causes of the low scores, they uncovered serious student performance issues with one type of question. Sure enough, when they continued probing, they found that the problem underlying this type of test question affected not only social studies scores but scores in every subject where such questions appeared. This new problem, when effectively addressed, had the potential to leverage results across the board, not just in social studies.

Afterward, we were pleased to see that improvement plans focused on written-response performance resulted in higher test scores for these otherwise-stagnant schools. Not only did their overall scores improve, but as teachers embedded such questions into their ongoing classwork, students' critical thinking and written expression skills grew markedly as well. The complex thinking required by such questions amplified the rigor of student work in many classrooms. We found it helps to solve the right problem.

When evaluating why a school isn't scoring well on standardized tests, it's important to keep in mind that simply knowing the test answers isn't always enough to generate high scores. Taking a holistic approach to evaluating student performance is the key to finding the weakest points of your testing preparedness strategy and overcoming those challenges. Need help evaluating your test preparation strategies? Contact us today; we would love to have a conversation with you about how we can work together.



The ideas in this post are explored further in our book, Turning Around Turnaround Schools, which is intended for those who work in or with turnaround schools, though the approach to education it presents works with any type of school (public, private, charter, and so on) at any level (struggling, passing, blue ribbon, and so on). We hope to share the lessons we have learned and describe strategies and processes that have proven successful for us. Most importantly, we hope to provide you with a set of understandings and tools that will guide your work and make you a more intentional, effective agent of change. Find it online at: https://www.amazon.com/Turning-Around-Turnaround-Schools-Conventional/dp/1948238020/ref=sr_1_1?ie=UTF8&qid=1537974543&sr=8-1&keywords=turning+around+turnaround+schools

