Initial Thoughts on MIT Report on Proposed Assignment Plans



MIT has now released a report with a tremendous amount of detail on their simulation of the three proposed student assignment plans. There are three files. The main report is 51 pages. There is also a 27 page technical appendix and a 137 page graphical appendix. So to say there's a lot to digest here would be an understatement.

I've read through the main report and skimmed the other two. I doubt that many people will find the technical appendix useful, as it is indeed quite technical. The main report and graphical appendix are more accessible, though still fairly complex. In particular, I would recommend that anyone reading the report read all of the detail on what each measurement means and how it was calculated. A quick read could definitely leave you misinterpreting some of the data.

One thing that's important to know is that most of the measurements in the report are based on round 1 of 2012's K2 assignments. MIT looked at what would have happened last year if each of the new plans had been in effect. Kids who entered in later rounds are not included in the numbers. The numbers for the current plan are also based on a simulation of 2012's K2 round 1, to create an apples-to-apples comparison. However, there is one difference between the simulations of the current and proposed plans. For the current plan, they use the current method of applying walk zone priority first. For the new plans they use the BPS proposal to change the assignment order to what they call a "compromise plan." It does not appear that this change has a large effect, but it probably provides some additional advantage to walk zone students.

One of the easiest things to look at is distance to school. This is clearly improved in all of the plans. In addition, the range of distances is much smaller in these new plans when looking at the 25th and 75th percentiles. That means there won't be as much variance in distance traveled as there is now, though there will still be outliers who have to travel over five miles to school. The differences between the three proposed plans are small, with median distances of 1.12 to 1.19 miles compared to 1.87 for the current plan.

They also looked at bus coverage area. This is the geographic area around a school but outside of its walk zone where students could apply to that school and could potentially need to be bused. This is also significantly reduced for all plans, ranging from an average of 6.9 to 8.6 square miles when the ELL overlay is included. This compares to 24.5 square miles under the current plan.

Moving on to equity, there are several ways to look at this. MIT is using a measurement called Effective Access to Quality. They are using BPS's quality measurement, which says that the top 50% of schools are quality schools, based on MCAS scores. They're defining access to quality as the chance a child has of getting into one of these schools, if the child chooses quality schools. This is an important distinction because not all parents are choosing quality schools based on this definition. In fact, MIT found that about 57% of families are currently choosing a "quality" school as their first choice.

This is a reasonable way to look at it, since it really doesn't make sense to say that a student who has quality schools on his or her list of choices, but chooses other schools first, doesn't have access to quality schools. The numbers that BPS has already released on this show the access to quality for the child with the lowest chance of getting a quality school. In the current plan, this is 19.5% in round 1. Remember, that's not the typical student; it's the one with the worst chances. The new plans would increase this to 22.6% for 10 zone, 22.4% for Home-Based A, and 25.5% for Home-Based B. These aren't huge differences, but they are significant. The report also has a chart that shows the full range of access and the 25th and 75th percentiles. I think these are more useful than looking at the worst case, as that could be an outlier. The 10 zone plan has a significantly wider range of access between the 25th and 75th percentiles than the current plan, which means that by this measure it's less equitable. Home-Based plan A has a slightly smaller range, and plan B a still smaller one. Since Home-Based B also has the highest chance at the worst case, I feel that it is the most equitable of the plans.

The report also looks at how well the plans perform as far as giving families access to their top choices. This can be thought of as a different way of looking at access to quality. Instead of using the BPS measure of quality, this uses parents' perception of quality as they choose schools. In some respects, this may be a better measurement as we know that the MCAS measurement is very limited and that each parent has his or her own definition of quality. On the other hand, some parents are inevitably making more informed choices than others, so this is still not a perfect measurement.

MIT looked at access to top choices in two ways. One is to look at access to the top menu choices. This is pretty straightforward. Knowing what each family's available schools are and using the demand model to approximate how they will rank those choices, what are a family's chances of getting into one of their top three choices? The problem with this measure is that it doesn't consider whether the schools a family wants are even on the choice list in the first place. In fact, by simply reducing the number of choices you are almost guaranteed to increase access to families' top menu choices (once you get down to three choices per family, you will reach 100%). So MIT also looked at access to what they call a family's "Top Dream Choices." This is defined as the top schools the family would choose if they had access to all BPS schools. They then look at what chance the family would have of getting one of these schools in the various plans. This measure penalizes a plan that doesn't offer families the schools they want, or that lets them choose these schools but gives them little chance of getting in.
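A toy example (hypothetical school names and a made-up ranking, not from the report) shows why shrinking the menu mechanically inflates "top menu choice" access: a family's top three *menu* choices are just whatever schools from their city-wide preference order happen to be on the menu, so a three-school menu makes every assignment a "top choice" by construction.

```python
# Hypothetical illustration: one family's city-wide preference order.
citywide_ranking = ["A", "B", "C", "D", "E", "F", "G", "H"]

def top3_menu_choices(menu):
    """The family's top 3 choices restricted to the schools on their menu."""
    ranked_on_menu = [s for s in citywide_ranking if s in menu]
    return ranked_on_menu[:3]

# With a large menu, only the family's genuine favorites count as
# "top menu choices" -- being assigned D would not qualify.
print(top3_menu_choices({"A", "B", "C", "D", "E", "F", "G", "H"}))  # ['A', 'B', 'C']

# With only three schools on the menu, *any* school the family can be
# assigned is automatically a "top menu choice" -- even their 4th, 6th,
# and 8th city-wide picks.
print(top3_menu_choices({"D", "F", "H"}))  # ['D', 'F', 'H']
```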

Under the Top Dream Choices measurement all of the new plans perform worse than the current plan. This is not surprising as they all offer families significantly fewer choices. In addition, the current plan is more equitable as it has a smaller range of access for different students under this measurement. There is very little difference among the three proposed plans under this measurement.

Consider this Part I of a series of posts on this report. There's a lot more here that I will try to cover in the next few days. The report contains maps to give an idea of geographic equity and break-downs based on race, ethnicity, and socioeconomics. Suffice to say that this is all quite complicated and will require careful consideration as we all try to understand how any of these new plans would affect our public schools and our city.




I had an email exchange with Peng himself last night about something I noticed with this report (he was very nice and quick to respond, BTW), and his answer has had my mind racing all morning about how I think using this report to evaluate the "Home" plans over the long term is a misleading way of using this data. I know that you put a disclaimer to this effect in your post here, but I think the possibility that the effects on an area, when you think about 2 or 3 years in the future rather than just 2012 K2, could be 180 degrees different is a BIG deal.

Ever since I used your tool to determine that under "Home" A Kilmer becomes a de-facto neighborhood school of a section of West Roxbury, I have been interested in how schools being blocked by other schools from major population centers would affect what would happen under these plans. With this in mind I looked at this report last night and noticed that the dots for that section of West Roxbury show a 50% chance of getting into a top 50% school. This did not jibe with what I would think would happen, since kids in West Roxbury get exclusive access to Kilmer (I think the top MCAS school in the whole city) and have a whole host of other tier 1 and 2 schools on their list.

I asked Peng about this and he said that the reason why their access numbers are so low for the Kilmer area is that they are looking at K2 for last year, and since Kilmer starts at K1 it is full, and thus unavailable for people in that area who would be starting at K2 today. He said this happens with other top schools that start at K1 in his analysis as well.

So it does make sense to me why his report says that for a new K2 kid in West Roxbury next year, access to top 50% schools is not that high, so in that way this report seems very reasonable. However, if you think about kids starting at K1 next year, and every year from now into the future, the access to a top 50% school is going to be very high. After all, they have exclusive access to arguably the best elementary school in the city, and especially after grandfathered siblings work their way out of the system, Kilmer is going to be 100% kids from West Roxbury, since it is the only area that has it on its list.

I think the big deal here, is that when I look at a document like this letter to the EAC (…) it has some caveats about the limitations of this analysis, but at the end it specifically says that access to quality for West Roxbury is unchanged. I do not dispute that this is true for the handful of new K2 kids entering the system next year, but that is absolutely not true for kids entering K1 next year, and in every future year from next year forward.

With this in mind I view this letter as highly misleading, not because Peng's model is wrong, but because it takes a model that applies to a very specific situation and makes an unqualified statement about what happens to West Roxbury. What they are saying is true for maybe 5% of all the kids in West Roxbury affected over the next 20 years (the K2 kids next year), but the experience for the other 95% of kids is totally different.

Now I want neighborhood schools (with other methods to promote equal access to quality seats), so in and of itself I am not against West Roxbury getting its own high-quality neighborhood school. However, "Home" A completely removes Kilmer from everyone else in the city, which even I would not do, and I think it is very unfair that West Roxbury gets its own neighborhood school under this plan while the rest of us do not get this benefit. Also, Kilmer is the only one that I have had the time to figure out, but if this is true there, then are we sure that there are not a host of other de-facto neighborhood schools created by this plan on the edges of town? Also, the kids who have the least access to good schools now are often located in the center of the city, and because of blocking in this plan they could all be given access to the same handful of quality schools in the middle of town while being cut off from the de-facto neighborhood schools on the perimeter, which I thought is exactly what pro-lotto reformers don't want to do.

However, the thing that concerns me most about this revelation is that it indicates to me that we have taken an analysis that was intended to model a very specific time frame and access point and are using it to make decisions about how to reform a system with tens of thousands of kids and millions of dollars. The fact that it took me less than 30 minutes to find an instance where the output of this model is totally different from the long-term effects one can reasonably assume will occur makes me very concerned about how many other effects of these complex systems are not captured, or could not be captured, by this type of modeling process.

In short I think the modeling of the 10 zone plan is probably reasonable since that system is well understood, but I think that if you are using the output of this model to argue about the impacts of the "Home" systems over the long term, you are using the wrong tool for the job.

Josh Weiss's picture


This is definitely worth looking into further. I'd be interested in knowing if Peng thinks that this would affect the overall findings. It clearly does affect the neighborhood comparisons. I took a quick look and it appears that most schools have more K2 seats than K1, but several do have about the same.

I know that the full report also has some stats on K1 assignment, which might be helpful. I think what would give the clearest picture would be to look at two years and assume that families that don't get an acceptable K1 seat apply again in K2, then look at the overall results of the one or two attempts at the lottery for each family. I think that's how this actually plays out in practice: most families apply for K1, then apply again for K2 if they were unassigned or didn't get an acceptable seat.

So I have been talking with Peng more, and he did have a good point that there are a lot more K1 kids in West Roxbury than seats in Kilmer, so even though Kilmer belongs to West Roxbury, it does not mean every kid has good access to it. However, that got me looking at this a little deeper.

First enter this address in your tool: 162 Perham Street, West Roxbury, MA (Picked at Random)

The schools on the list of possibilities for this address for Home A are the following, with their enrollments from the state DOE website (which are a little off because of general ed vs. special ed, but it's what I have):

Lydon - 44 K1 - 62 K2
Kilmer - 52 K1 - 44 K2
Beethoven - 44 K1 - 76 K2
Mozart - 22 K1 - 20 K2
Bates - 25 K1 - 44 K2
Sumner - 41 K1 - 86 K2
Greenwood - 0 K1 - 58 K2

All of these schools are tier 1 and 2, with the exception of Greenwood, which is 4. If we total up these numbers we find that for K1, if this student gets a seat, they have a 100% chance of being in a top 50% school. For K2, if they are in a seat in one of these schools, there is an 85% chance that it is top 50%, assuming the lotto is equal across all of them. I know that the lotto is not actually equal because of higher demand at good schools, but factor in a walk zone if it exists (which I am not accounting for here) and the fact that some schools like Kilmer are isolated and not available to anyone outside West Roxbury, and I think that it may even out. Also, there are only 58 out of 390 K2 seats on this list that are tier 4, so even if every kid at Greenwood is from West Roxbury, that is still only a small percentage of the total seats.

Now let's look at another address: 37 Holiday Street, Dorchester, MA

Henderson (tier 1) - 36 K1 - 47 K2
Murphy (1) - 43 K1 - 71 K2
Mather (2) - 65 K1 - 83 K2
Greenwood (2) - 50 K1 - 45 K2
Lee (3) - 73 K1 - 46 K2
Everett (3) - 15 K1 - 42 K2
King (4) - 41 K1 - 47 K2
Holland (4) - 48 K1 - 121 K2
Marshall (4) - 111 K1 - 109 K2
Holmes (4) - 27 K1 - 46 K2

Of the 509 K1 seats that a child at this address is trying to lotto into, 194 (38%) are tier 1 or 2. For K2 the numbers are 246 out of 657, or 37%. Keep in mind also that since their tier 1 and 2 options are not geographically isolated, they are likely available to more people.

Now Peng's model mutes the effect of this because, since it is focused on K2 kids entering the system next year, a lot of the seats in West Roxbury are full from K1, and there are grandfathered siblings from other parts of the city. However, I would argue that over the long term these numbers are going to be more important, as the last grandfathered siblings work their way through the system. At the end of the day, if the plan is that these are the schools that people have to go to, and for some people the vast majority of the seats on their list are tier 1 and 2, they must have a good chance of getting one of those seats.

So we are talking 100% vs 38% and 85% vs 37%.
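For what it's worth, the tallies behind those four percentages can be reproduced in a few lines of Python. The enrollment figures and tier labels below are exactly those quoted in the two lists above (state DOE numbers, so approximate); "quality" here just means tier 1 or 2.

```python
# (K1 seats, K2 seats, tier 1 or 2?) per the figures quoted above.
west_roxbury = {
    "Lydon":     (44, 62, True),
    "Kilmer":    (52, 44, True),
    "Beethoven": (44, 76, True),
    "Mozart":    (22, 20, True),
    "Bates":     (25, 44, True),
    "Sumner":    (41, 86, True),
    "Greenwood": (0, 58, False),    # tier 4
}
dorchester = {
    "Henderson": (36, 47, True),    # tier 1
    "Murphy":    (43, 71, True),    # tier 1
    "Mather":    (65, 83, True),    # tier 2
    "Greenwood": (50, 45, True),    # tier 2
    "Lee":       (73, 46, False),   # tier 3
    "Everett":   (15, 42, False),   # tier 3
    "King":      (41, 47, False),   # tier 4
    "Holland":   (48, 121, False),  # tier 4
    "Marshall":  (111, 109, False), # tier 4
    "Holmes":    (27, 46, False),   # tier 4
}

def quality_share(schools, grade):
    """Share of the listed seats at a grade that are in a tier 1/2 school."""
    i = 0 if grade == "K1" else 1
    total = sum(seats[i] for seats in schools.values())
    quality = sum(seats[i] for seats in schools.values() if seats[2])
    return quality / total

for name, schools in [("West Roxbury", west_roxbury), ("Dorchester", dorchester)]:
    for grade in ("K1", "K2"):
        print(f"{name} {grade}: {quality_share(schools, grade):.0%}")
# West Roxbury K1: 100%, K2: 85%; Dorchester K1: 38%, K2: 37%
```

Of course this is only a seat-count share, not an admission probability; as noted above, it ignores demand differences, walk zones, and siblings.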

Josh, you probably have all the data to make a map with this calculation for each geocode. If you feel like sharing your data file with me I could probably put it together, maybe I am wrong but I think it is going to look a lot like the current school quality maps, but maybe even more extreme.

Ian, I think in general your interpretation of the facts is right. And it's probably true that the very brief summary discussion in the memo of the effects is somewhat lacking.

However, I have a different view about the conclusion. The Home Based models are pretty generic, and I think their flexibility extends to K1; the way they work there will be broadly similar to K2. They are not perfect for anyone, but they do a pretty good job of balancing the tradeoffs.

In other words, if we were still looking at zone based models, I would want detailed analysis, as once they are done, they are cast in stone. But the Home Based models are likely to give pretty good outcomes in a variety of situations. And considering the geography of the city and the schools, that's the best we can do right now. (And a lot better than where we were headed six months ago.)

And the Home Based models can adapt over time, to be even better.

Thanks Josh for this very readable summary of the report!

One small clarification in definition of access to quality: A student's effective access to quality is his/her maximum chance of getting into a quality school that he/she finds acceptable. In the report we estimate "acceptable" as meaning being in his/her top 10 choices. (Since we assume everyone ranks up to 10 choices, that implicitly assumes the 11th choice is "unacceptable.") This clause of "that he/she finds acceptable" was not in Josh's summary but I want to emphasize it because it is an important clause.

Another way to think about this is: suppose the student were to reshuffle his/her top 10 choice lists so that he/she is ranking based purely on MCAS, what is his/her chance of getting into a top 50% MCAS school?

The reason we need the "acceptable" clause is this: suppose there's an easy-to-get-into school that has good MCAS but is, say, very far away and would only make it to, say, the family's 20th choice. It would be kind of unfair to say the family has high access to quality because of this school, because one would think that the family would not even go if assigned. It is not sufficient to provide "access to quality"; we need to provide "access to quality" that the family can plausibly accept.

The cut-off of 10 is somewhat arbitrary: for some families, perhaps anything beyond their 3rd choice is unacceptable and they will just go to a private school or METCO, while for other families perhaps a 20th choice is still acceptable. However, since we do not model outside options, we cannot capture the variation in "acceptability levels," so we simply use 10 as a reasonable "acceptability cutoff." This reasonably matches the overall level of competition. For experiments with other cutoffs, see the technical appendix.
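To make the definition concrete, here is a minimal sketch of it in code. This is not MIT's actual model: the school names and admission probabilities are made up, and it takes each school's admission chance as given rather than deriving it from the assignment mechanism. It just shows the "maximum chance over acceptable quality schools" structure described above.

```python
TOP_N = 10  # the "acceptability cutoff": only a family's top 10 count

def effective_access(ranked_choices, admit_prob, quality_schools):
    """Max admission chance over quality schools the family finds acceptable.

    ranked_choices: the family's schools in preference order
    admit_prob: assumed chance of admission at each school (hypothetical here)
    quality_schools: the set of top-50%-MCAS schools
    """
    acceptable = ranked_choices[:TOP_N]
    chances = [admit_prob[s] for s in acceptable if s in quality_schools]
    return max(chances, default=0.0)

# Hypothetical family: school C is easy to get into but not "quality,"
# so it doesn't count; the best shot at an acceptable quality school is B.
choices = ["A", "B", "C", "D"]
admit_prob = {"A": 0.10, "B": 0.40, "C": 0.90, "D": 0.30}
quality = {"A", "B"}
print(effective_access(choices, admit_prob, quality))  # 0.4
```

Note how a quality school outside the top-N list would contribute nothing, which is exactly the point of the "acceptable" clause.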

Josh Weiss's picture

Thanks, Peng, that's a good point. I was glad to see how many issues like this you had considered.

I was really interested to read about the different definitions of quality in play here. My gut feeling about these proposals has been that I'd probably have a slightly better chance at getting a school that I'd be OK with, but would completely eliminate my access to the schools that come closest to meeting all of the criteria that I used to determine my top choices. And that's pretty much exactly what the two "top choices" models show as a general trend across the city, if I'm reading this summary correctly.

Personally, I don't think I'd feel as good about my choices under this new plan as I did under the old plan, but I'm saying that with the benefit of hindsight and as a lottery "winner" who received a slot at a fairly highly chosen school that is outside of my walk zone. It's easy for me to look back at the lottery and say how great it was that I had access to the schools I did. At the same time, I suppose that for the families who fared worse than I did it's just as easy to say that having access to more schools isn't worth a whole lot if your chances of getting a high demand seat are so slim because of that broad access. It's really hard to judge the trade-offs here, and I don't envy the EAC the task.
