This morning’s New York Times (I read the paper version, I know, I’m old) brought news of new research out of Stanford that should shape the way we think about what counts as a successful school district. The article linked above, which you should absolutely read, does a good job of breaking down rock star education researcher Sean Reardon’s new study on educational opportunities. Herein, I give you a brief summary of the study, because it’s very clever. This part is a little nerdy. Then I give you the good news. Because I feel like a lot of times when we talk about the Montgomery schools we are full of bad news. Finally, I give you the bad news. Because there’s some pretty bad news in this study. Then I open the floor for discussion, if anyone out there in Lost in Montgomery-land is still listening. I know it’s been a while since we’ve written.
So, first, what is this new study and why does it have me so excited? Two things, really. First, it’s really big. It looks at 45 million students in 11,000 school districts. That’s a lot of data. Second, it’s methodologically innovative. Generally, studies that look at how well schools are doing just look at average test scores. That’s how Alabama decides which schools will be called “failing” for its private school giveaway tax credit.
This study calculated average test scores in third grade, but then went a giant step further and calculated growth rates in test scores from grades 3-8 for twelve different cohorts of students. This is a measure of student improvement over five years, and it’s meant to take a closer look at what, compared to the entry baseline, schools are adding to a student’s experience. Think about it this way: if everything performed as expected, a student should gain five years’ worth of schooling in five years. If a school is excelling, a student could have a higher growth rate. If a school is underperforming, a student would get less than five years’ worth out of their five years of seat time.
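The arithmetic behind that “years of growth” framing is simple enough to sketch. This is a toy version, not Reardon’s actual statistical model, and the district numbers below are hypothetical, chosen only to resemble the patterns described in this post:

```python
def years_of_growth(grade3_level, grade8_level):
    """Grade-level growth between grade 3 and grade 8.

    Inputs are scores already converted to grade-level equivalents
    (e.g. 2.0 means "performing like a typical 2nd grader").
    """
    return grade8_level - grade3_level

# Hypothetical district that starts a year behind and falls further
# behind: only 3 grade levels gained in 5 calendar years.
print(years_of_growth(2.0, 5.0))

# Hypothetical district that starts behind but outpaces the expected
# pace: 6 grade levels gained in 5 calendar years.
print(years_of_growth(1.5, 7.5))
```

The point of the measure is that the baseline is subtracted out: a district can start behind (low average scores) and still post strong growth, or start ahead and coast.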
If average test scores were a good measure of school performance, we’d expect growth rates to be correlated with average scores. The thing is that those two measures – growth rates and average test scores – turn out not to be correlated. Which means, in the first place, that we should not be using average test scores to identify so-called failing schools, as they are likely to be simple reflections of socioeconomic status. This kind of ranking fails to incentivize growth and, as Reardon argues, may drive parents into districts with higher socioeconomic status, increasing economic segregation (ahem, Pike Road).
Before we get to changes in test scores, let’s see how Montgomery’s average scores stack up nationally. Here’s where our third graders stand.
We’re almost three years behind the national average now, while Chicago’s third graders are scoring pretty close to it. Here’s what the change in scores looks like:
And here’s how to read this chart (all charts are from the NYT interactive linked in the first paragraph). Chicago students are getting about six years’ worth of test growth out of their five years of schooling – a pretty remarkable achievement. Montgomery students are only getting three years’ worth – putting us in the bottom 1% of school districts nationally.
But wait, you might be saying, I thought there was going to be some good news here. There is. This study’s findings contain a few things that should give us hope that MPS can be improved. First, it finds that socioeconomic status is not destiny. A relatively poor (81% free and reduced price lunch) and racially segregated school system (10% white) like Chicago’s can make big gains, even though (unlike Montgomery) it has a large number of English language learners (about 17%).
And it means that big increases in per-pupil spending (while good) aren’t necessary. The Saraland schools spend some of the least in Alabama (only $7,789 per pupil in 2016) but saw 4.8 years of growth over 5 years. By contrast, wealthy Mountain Brook spent $12,162 per pupil and saw 4.9 years of growth in the same period. That’s not a lot of return on investment. Montgomery spends only $8,420 per pupil, and Chicago’s $12,000 seems like a lot until you realize that a) the cost of living is a lot higher there, and b) that’s still less than Mountain Brook. Here’s one more chart I pulled showing how we stack up against nearby districts.

So we may not need to raise property taxes to get better schools, which is good, because it’ll probably be a cold day in hell before that happens here. And we can stop scapegoating race and poverty, because districts like Chicago do just fine. All of which is good news. It gets us out of the mindset of things we can’t fix and puts us into a place to consider the things we can fix. Which is good, because MPS is in serious trouble.
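The “return on investment” point above is back-of-envelope arithmetic, and it can be made explicit. Using only the spending and growth figures quoted in this post (2016 dollars), here is dollars spent over the five-year window per year of measured growth:

```python
# Per-pupil spending and years of growth, as quoted above.
districts = {
    "Saraland":       (7789, 4.8),
    "Mountain Brook": (12162, 4.9),
}

for name, (spending, growth) in districts.items():
    # Rough cost of the 5-year window divided by growth delivered.
    five_year_cost = spending * 5
    print(f"{name}: ${five_year_cost / growth:,.0f} per year of growth")
```

On these numbers Saraland buys a year of growth for roughly two-thirds of what Mountain Brook pays, which is the crude sense in which spending alone doesn’t predict results.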
But there’s still plenty of bad news here. Mostly the part where our students are only getting three years’ worth of education out of five years of seat time. That’s positively criminal, and it’s something we need to take very seriously. The years in grades 3-8 sort you out for success or failure in high school, where tracking makes it difficult for students identified as underperforming to break out of the molds we put them into. These are critical times, and MPS is failing at its core duty. We need to be having serious, data-driven conversations about what it means to get students to grade level, and how to expand proven practices like teacher collaboration and principal autonomy. Otherwise we’re failing our children – and worse, our community.
Discussion is welcome in the comments section.