Once we have established where students are expected to be as learners and performers by the time they take a test or transition to the next year, we then facilitate the school's analysis of its current status. This can vary from state to state depending on the nature of, and the data included with, the state report, but it will almost always begin with an analysis of district or school "success" rankings.
Did the school meet its state goal? (Note that the ultimate state goal is 100% proficiency in content areas; the standards apply to all students.) Were there any recommendations or mandates included by the state in its report?
If the school did not reach its goal, why not? Where were points lost? (Note: Ed Directions coaches use a simple chart to begin this discussion.)
Tested Area | Last Year's Score | This Year's Goal | This Year's Actual Score | Initial Thoughts
Why were the points lost? This requires a “deep dive” into the test report. In many states schools can learn:
possible weak areas of content
possible weaknesses in question type (e.g. constructed-response questions), venue (e.g. reading with an embedded visual or graphic) or duration (performance decline at the end of the test)
"red flag" areas (e.g. areas of student proficiency under 50%, off-level performance by advanced or honors students, performance gaps between and among identifiable groups, etc.)
Some states provide more information but almost all states allow this level of data analysis. It allows Ed Directions coaches to use a simple chart to transition leadership teams from thinking about the scores to thinking about the students who earn the scores.
Concern | Data/Description | Initial Thoughts
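The red-flag screen described above can be sketched as a short script. This is a minimal illustration, assuming scores are available as per-area proficiency rates; the area names, rates, and the 50% threshold are placeholders, not actual state report data.

```python
# Flag tested areas where the proficiency rate falls below 50%,
# one of the "red flag" criteria described above.
# Area names and rates here are illustrative placeholders.
proficiency_by_area = {
    "Reading": 0.46,
    "Math": 0.58,
    "Science": 0.49,
}

RED_FLAG_THRESHOLD = 0.50  # "area of student proficiency under 50%"

red_flags = {
    area: rate
    for area, rate in proficiency_by_area.items()
    if rate < RED_FLAG_THRESHOLD
}

for area, rate in sorted(red_flags.items()):
    print(f"RED FLAG: {area} proficiency at {rate:.0%}")
```

A leadership team would then carry each flagged area into the Concern / Data/Description / Initial Thoughts chart for discussion.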
In states where scores can be sorted by group or by teacher, this analysis can be carried to another level. We can identify areas of strength and concern with more precision. Ed Directions coaches use the questions below to help clarify what the scores actually tell us.
Were the proficient and non-proficient scores evenly distributed across teachers at a grade level or within a discipline?
Teacher/grade level | Tested area(s) | Number proficient | Number not proficient | Score
Were the proficient and non-proficient scores evenly distributed across programs (e.g. honors program, special education, ESL, etc.)?
Program: | Test | Number proficient | Number not proficient | Score
Grade level: | Total number:
Did students with “at risk” characteristics (e.g. high absence rate, behavioral issues, lack of prior academic success, etc.) underperform?
At-risk group: | Test | Number proficient | Number not proficient | Score
Grade level: | Total number:
Were there gaps between/among the "accountability" groups of students (e.g. socioeconomic status, race, location, etc.) as compared to the "majority"?
Accountability Group | Test | Number proficient | Number not proficient | Score
Grade level: | Total number:
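The group-comparison questions above can be worked through with a simple calculation: compute each group's proficiency rate from its proficient/not-proficient counts, then compare it to the majority group's rate. This is a hedged sketch; the group names, counts, and the 10-point gap threshold are invented for illustration.

```python
# Compare proficiency rates of accountability groups against the
# "majority" group, flagging gaps above a chosen threshold.
# Group names, counts, and the threshold are illustrative only.
groups = {
    "Majority": {"proficient": 120, "not_proficient": 40},
    "Low SES": {"proficient": 30, "not_proficient": 30},
    "ESL": {"proficient": 12, "not_proficient": 18},
}

def rate(counts):
    """Proficiency rate = proficient / (proficient + not proficient)."""
    total = counts["proficient"] + counts["not_proficient"]
    return counts["proficient"] / total

majority_rate = rate(groups["Majority"])

# Gap of each accountability group below the majority rate.
gaps = {
    name: majority_rate - rate(counts)
    for name, counts in groups.items()
    if name != "Majority"
}

GAP_THRESHOLD = 0.10  # flag gaps of 10 percentage points or more
flagged = {name: gap for name, gap in gaps.items() if gap >= GAP_THRESHOLD}
```

With the sample counts above, both groups trail the majority's 75% rate by well over ten points and would be flagged for the team's "initial thoughts" column.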
We can also begin an analysis of individual student performance and identify "red flags" in individual student score profiles. In this case the red flags indicate that the student did not meet expectations, either by overperforming or underperforming. A sorting tool used in many Ed Directions schools looks like this:
Student | Test: | Test: | Test: | Test: | Test:
 | Expected / Actual | Expected / Actual | Expected / Actual | Expected / Actual | Expected / Actual
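The sorting tool's logic, comparing each student's expected performance level with the actual level earned, can be sketched as follows. This is an illustration only: the level names and student records are hypothetical, not from an actual state scale.

```python
# Sort students by comparing expected vs. actual performance levels,
# mirroring the Expected/Actual columns of the sorting tool above.
# The level scale and student records are illustrative placeholders.
LEVELS = ["Novice", "Apprentice", "Proficient", "Distinguished"]

students = [
    {"name": "Student A", "expected": "Proficient", "actual": "Apprentice"},
    {"name": "Student B", "expected": "Apprentice", "actual": "Proficient"},
    {"name": "Student C", "expected": "Proficient", "actual": "Proficient"},
]

def flag(record):
    """Label a student whose actual level differs from the expected level."""
    diff = LEVELS.index(record["actual"]) - LEVELS.index(record["expected"])
    if diff < 0:
        return "underperformed"
    if diff > 0:
        return "overperformed"
    return "met expectations"

# Red flags: any student who did not meet expectations, in either direction.
red_flags = {s["name"]: flag(s) for s in students if flag(s) != "met expectations"}
```

Note that, as in the text, overperformance is flagged alongside underperformance: both signal that expectations for that student were wrong and need revisiting.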
At this point, we know the district-wide and school-wide problems: not only gaps in curriculum congruence with the standards, but also areas where we fail to meet goals and weaknesses in content and format. These "red flags" give us a starting point for building an intentional plan, one that targets areas needing attention far more precisely than a "low reading score" notation on a state report.
Frank DeSensi is the founder and Chief Innovation Officer of Educational Directions, LLC, which consults with schools and school districts in the southeastern and midwestern United States. A retired educator, Frank spent 35 years in a variety of teaching and administrative positions. He taught at the university, college, secondary, and middle-school levels; worked in the central office as a curriculum specialist; and held both principal and assistant principal positions. From 1993 to 1998, Frank served as a Kentucky Distinguished Educator, helping to turn around schools that were labeled in decline or in crisis under the provisions of the Kentucky Education Reform Act. Frank helped develop the STAR training program for new DEs and served as a trainer in the Kentucky Leadership Academy. He jointly holds patents for three data-management systems for schools.