by Betsy Denson
As a culture, we love to rank things in order to estimate their value. Numerous sports analysts have issued their rankings of NFL Draft prospects. U.S. News & World Report grabs headlines each year with its numerical rundown of hospitals and colleges – high schools too. Houston was shouting from the rooftops when Forbes magazine named us Coolest City a few years back.
But what if the rankings don’t go your way? The recently released CHILDREN AT RISK report, which ranked and graded Houston’s schools as part of its Texas Public School Rankings, has people talking because it concerns the thing that matters most to parents – their kids. Where to send them to school is not a decision made lightly.
Realtors start getting more questions from prospective home buyers about zoning. Principals get phone calls asking what’s with the slip down the ladder. People start comparing notes on Facebook.
What does it all mean? CHILDREN AT RISK is a 20-year-old nonprofit advocacy organization that has done good work focusing attention on the needs of Texas children. In 2006, it developed the school ranking system to start a dialogue about how to improve schools. The methodology changes slightly from year to year, but it is largely based on test scores – for 2014, primarily the STAAR reading and math tests. For high schools, graduation rates are also a factor.
How did Leader schools do?
In the Leader area, six schools scored an A or B rating. Among elementary schools, Oak Forest Elementary received an A+ and was ranked 61. Three Heights schools were also represented: Harvard received an A and came in at 87, Travis Elementary scored an A and was ranked 128, and Field Elementary received an A- with a rank of 150. Timbergrove’s Sinclair Elementary got an A- and a 156 ranking. There was one middle school, Hamilton in the Heights, with a B- rating and a spot at 159 on that list. No Leader high school scored an A or a B; the highest ranked was Reagan, with a C- and a rank of 90. Complete listings can be found on the CHILDREN AT RISK website.
Study authors caution against comparing schools from year to year because of changes in the methodology, but it’s hard for parents not to, and then to scratch their heads. Oak Forest Elementary fell from 6 to 61. Travis from 76 to 128. Love from 382 to 589. Garden Oaks from 107 to 495.
Study co-author Caroline Neary, who is also the assistant director at CHILDREN AT RISK’s Center for Social Measurement & Evaluation, said that there are only a few points separating the top tier schools and encourages parents to focus more on the A grade than on a ranking. Then why rank them in the first place? And for schools that took a 200- to 300-point dip, did all their students suddenly forget everything they knew? Did the teachers vacate the building?
What about the newly earned IB status at Durham Elementary and Hogg Middle School? What about Tim Weltin, a former lawyer who gave up his job to work at Black Middle School because he believes so much in the culture of excellence they are creating?
And what about principals who are working hard and seeing positive changes at their school, only to be presented with a ranking that can devalue their school in the eyes of others?
Stevens Elementary Principal Lucy Anderson said that while they weren’t thrilled with this year’s CHILDREN AT RISK ranking, they will use it as an opportunity to review their practices and find ways to improve for next year. “There are many ways to measure a school’s success,” she said. “According to the TEA, our school has ‘Met Standard with Distinctions’ in two out of three areas. We can’t be discouraged by this one measurement, we have to continue finding the best ways to educate our students.”
But while schools take note of the results and move forward, parents of preschool children may take them as gospel and miss out on a school that could be a great fit for their kids.
“I understand that can happen, and it’s an undesirable side effect,” said Neary. “It’s great when schools and communities are already working to turn things around, and it is certainly not our intention to harm that process. However, more frequently schools are not actively trying to turn themselves around until parents see these rankings and demand a change. The rankings are often a catalyst for that parental involvement.”
What should be the benchmark?
Some say that old results shouldn’t be the basis for the study. Katherine Heinrich, part of the ‘Learn Local’ group at Hogg, thinks that “for neighborhood schools that are in the process of a turnaround, it’s a shame that this respected nonprofit uses data that’s nearly a year old (from the April 2013 STAAR results) in what it touts as its ‘new’ 2014 rankings.”
Others don’t think the tests should play a primary role. As one Heights mom, whose school of choice is Wilson Montessori in Montrose – a school with a waiting list of 300 and a B- score from CHILDREN AT RISK – wrote: “I beg of CAR to start challenging this system, not buying in to it and ranking perfectly wonderful schools who focus on the whole child and not as much on the test low on your list. Talk to parents and students. Are kids learning how to think, how to be creative, and how to be kind and responsible global citizens? That’s what matters…especially in low socioeconomic areas.”
With regard to students at an economic disadvantage, it’s worth noting how CHILDREN AT RISK takes that into account. Neary explains that the STAAR scores are evaluated from two angles. The Achievement Index looks strictly at raw performance – what percentage of students scored at the Advanced level. The Performance Index looks at how a school performed compared to how it would be expected to perform given its level of economic disadvantage. “It goes both ways,” she said. “If an affluent school shows average performance on STAAR, but based on their level of economic disadvantage we would expect them to perform well, they are going to have a lower performance index score.”
Here’s her simplified illustration: “Say School A has 90% economically disadvantaged students and School B has only 25% economically disadvantaged students, but they both have 32% of students scoring at the Advanced level on STAAR reading. If we are just looking at raw achievement, they are doing the same – but we can assume that the low income school is working harder to get that score. The low income school is going to get a higher adjusted reading deviation score and do better in the performance index than the affluent school.”
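For readers who want to see the arithmetic behind Neary’s illustration, the two-index idea can be sketched as a toy calculation. The actual CHILDREN AT RISK formulas are not published in this article, so the expectation line below (its slope and intercept) is purely hypothetical – the point is only that the same raw score yields different adjusted scores at different levels of economic disadvantage.

```python
def achievement_index(pct_advanced):
    """Raw performance: share of students scoring Advanced on STAAR."""
    return pct_advanced

def expected_advanced(pct_econ_disadvantaged):
    """Hypothetical expected score: schools with more economic
    disadvantage are expected, on average, to post lower raw scores.
    The slope and intercept here are invented for illustration."""
    return 50 - 0.4 * pct_econ_disadvantaged

def performance_index(pct_advanced, pct_econ_disadvantaged):
    """Actual minus expected: positive means beating expectations."""
    return pct_advanced - expected_advanced(pct_econ_disadvantaged)

# Neary's example: both schools have 32% of students at Advanced,
# but School A is 90% economically disadvantaged and School B only 25%.
school_a = performance_index(32, 90)  # beats its (low) expected score
school_b = performance_index(32, 25)  # falls short of its expected score
print(school_a, school_b)
```

Under this made-up expectation line, School A comes out well above its expected score while School B falls below its own, so School A fares better on the adjusted index despite identical raw achievement – which is the pattern Neary describes.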
An example of this in The Leader area is Field Elementary, which earned an A- with a student body that is 92% low income. But it was something else that initially appealed to Patty McGrail, who has been working with others to educate the public about Field, her neighborhood school.
“I saw how well the teachers liked the principal, and at the end of the day when they should be tired, they were laughing and having fun. I thought that they were the kind of teachers I’d want to teach my son.”
There are no doubt a variety of factors that contributed to Field’s testing success, including student effort and teachers “going the extra mile,” according to McGrail. The school proudly notes its CAR Gold Ribbon Award on its website. Perhaps next year Heights residents who wouldn’t have previously considered the school will give it a serious look. And maybe a few years after that, it will be a school richer in diversity with increased resources to achieve its goals.
Because rankings are a funny thing. We might say we don’t care about them. We might say we don’t agree with them. But we still listen to them. Maybe it’s time to turn down the volume a bit.