« April 2010 | Main | June 2010 »

3 posts from May 2010

May 20, 2010

Brill-ing Down: Adding to Steve Brill’s NYT Magazine Report

UPDATE: Thanks to Aaron Pallas for pointing out an error. The enrollment for Harlem Success Academy in 2008-2009 was 398 students. The enrollment was 276 students in 2007-2008.

Steve Brill’s latest article chronicling the politics of the Race to the Top competition is already making waves. One contentious aspect of the piece is Brill’s comparison of two schools that share the same building: Harlem Success Academy and P.S. 149. We thought it would be helpful to augment Brill’s commentary with additional data. The first table has 2008-2009 demographic data, and the next two have 2008-2009 3rd grade state test scores. (Harlem Success only had this one testable grade.) Information was culled from NY State Accountability Report Cards as well as special education invoices provided to the UFT by the New York State Education Department.

According to this data, Harlem Success Academy does appear to serve a less needy student population in terms of economic status, limited English proficiency, and special education needs. On the other hand, Harlem Success dramatically outperforms P.S. 149 on 3rd grade test results.




*There were not enough Limited English Proficient students tested in 3rd grade at P.S. 149 to report a score. There were no Limited English Proficient students tested in 3rd grade at Harlem Success.

May 11, 2010

Closing the Gap: Charter School Special Education Statistics

Last week, the New York State Senate passed a bill that would increase the number of charter schools in New York from 200 to 460. Included in the bill was a provision that charter schools increase efforts to enroll students with learning disabilities — an attempt to appease critics who claim that charters significantly under-enroll students with disabilities.

Yet an examination of data provided to me by the city shows that while charters enroll fewer students with disabilities, the gap is not as large as initially reported by the state teachers union, known as NYSUT. According to the Department of Education data, 13 percent of charter school students have Individualized Education Plans, indicating that they have special needs, compared to 15 percent at traditional public schools. NYSUT reported the numbers as being 9.4 percent at charter schools and 16.4 percent at district schools.

The discrepancy stems from problematic data NYSUT received from the state education department (NYSED). According to NYSED, the number of students with disabilities that a charter school reports enrolling often does not match the numbers reported by school districts. As a result, NYSED does not consider its own data reliable.

As an alternative, I used a database known as CAPS, which is compiled by the Committee on Special Education.  CAPS includes information about every student in the city who has an IEP, so it provides a more accurate breakdown of the number of special education students at each school. 
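To make that calculation concrete, here is a minimal sketch of how a per-school IEP rate can be computed from a CAPS-style student-level export joined to an enrollment register. The school codes, student IDs, and enrollment figures below are invented for illustration only; they are not drawn from the actual data.

```python
# A minimal sketch of the per-school IEP-rate calculation, assuming a
# CAPS-style export with one record per student who has an IEP.
# All school codes and enrollment numbers are hypothetical.
from collections import Counter

# One record per student with an IEP: (student_id, school_code)
iep_records = [
    (1, "CHARTER-A"), (2, "CHARTER-A"),
    (3, "PS-149"), (4, "PS-149"), (5, "PS-149"),
]

# Total enrollment per school, from a separate enrollment register
enrollment = {"CHARTER-A": 40, "PS-149": 50}

# Count IEP students per school, then divide by total enrollment
iep_counts = Counter(school for _, school in iep_records)
iep_rates = {
    school: 100.0 * iep_counts.get(school, 0) / total
    for school, total in enrollment.items()
}
print(iep_rates)  # {'CHARTER-A': 5.0, 'PS-149': 6.0}
```

Because CAPS is student-level rather than school-reported, a tally like this avoids the mismatches NYSED flagged in its own school-reported figures.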

I found that the percentage of charter schools enrolling students with disabilities at rates comparable to or greater than their traditional public school counterparts increased by 5 percentage points, from a quarter of schools last year to almost a third of schools this year. Meanwhile, charter high schools actually enroll a larger percentage of students with IEPs than traditional public high schools, both within their geographical districts and citywide. Of course, there are only a small number of charter high schools.

I also noticed that the 54 charter schools that are authorized by the DOE enroll, on average, 4 percentage points more students with IEPs than the 37 charters that are authorized by SUNY's Charter Schools Institute. (City-authorized charters have special education populations of around 14 percent, versus 10 percent for the SUNY schools.) The New York State Education Department authorized just seven charter schools in 2009, and those schools enrolled special education students at about the same level as the city-authorized schools.

A full breakdown of the data is available at the end of this post. To see my calculations as well as special education enrollment by school, you can check out this spreadsheet.

Notes on Methodology:
1. The number of students in need of IEPs can fluctuate as the year progresses. I used data from the end of the school year, since the DOE classifies this as the "high-water mark" for special education numbers.

2. To compare charter schools to traditional public schools, I looked at the average IEP rates of traditional public schools in the same district as the charter school, broken down by grade. So, for example, if a charter school enrolled students in kindergarten through fifth grade in District 5, I looked at the average IEP rate of kindergartners through fifth-graders in traditional public schools in that district. I did this because IEP rates vary significantly by grade, with lower rates in the early grades (K-2), the highest rates in middle school, and average rates in high school.
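As a rough illustration of this grade-matching, the sketch below averages district IEP rates over only the grades a charter school actually serves. Every number here is invented; the real comparison used the CAPS-derived rates described above.

```python
# Grade-matched benchmark sketch: average a district's IEP rates over
# only the grades the charter serves. All rates below are hypothetical.
district_rates = {  # IEP rate (percent) by grade, traditional schools in one district
    "K": 8.0, "1": 9.0, "2": 10.0, "3": 13.0, "4": 14.0, "5": 15.0,
    "6": 18.0, "7": 19.0, "8": 19.0,
}

charter_grades = ["K", "1", "2", "3", "4", "5"]  # a hypothetical K-5 charter

# Benchmark: mean district rate over only the charter's grade span
benchmark = sum(district_rates[g] for g in charter_grades) / len(charter_grades)
print(benchmark)  # 11.5
```

Averaging over the full K-8 span instead would pull the benchmark upward, since middle school grades carry the highest IEP rates; restricting to the charter's own grade span keeps the comparison fair.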

3. Since the number of students requiring special education services has increased dramatically over the past five years (see chart below), there has been a significant increase in the percentage of students who need services at both the charter and district schools. Thus, if you look at the data for two consecutive years, you will notice a significant increase at both the charter and district level—this is not a data error.


4. Unfortunately, my data did not differentiate among the kinds of services required by students with IEPs, so I cannot make any inferences about the level of need among special education students at the two kinds of schools. Because of this, I did not include data from District 75 schools, which tend to serve the most severely disabled students.

As always, I welcome your feedback and suggestions in the comments section.

May 04, 2010

April Research Round-Up

This is the second in our series examining the most interesting, most discussed academic research on education published each month. We’re publishing it a little early since May 1st was a Saturday this month. As always, let us know if we forgot anything – or if you have any researchers that you’d like us to add to our list!

Study: Evaluation of the Comprehensive School Reform Program Implementation and Outcomes: Fifth-Year Report Source: Daniel K. Aladjem, Beatrice F. Birman, Martin Orland, Jennifer J. Harr-Robins, Alberto Heredia, Thomas B. Parrish, Stephen J. Ruffini, WestEd Research Report

Results: The Comprehensive School Reform program was designed to improve high-poverty, low-achieving schools by instituting scientifically proven comprehensive reforms. A new report by WestEd researchers finds that of the 7,000 schools that received funding, only a third "selected reform models identified as having a scientific research basis". Overall, the program was "not associated with widespread achievement gains". Indeed, out of the 1,037 CSR elementary schools that were low performing, only 47 showed “dramatic and sustained” achievement gains.

What’s Interesting: Among the schools that did make achievement gains, no single strategy was common to all of them.

Caveat: There was very little cohort data available before 2004-2005, so it was impossible to track cohorts of students. Differences in state standards for testing made it difficult to compare students nationally.

Study: Financial Incentives and Student Achievement: Evidence from Randomized Trials Source: Roland G. Fryer, National Bureau of Economic Research Working Paper

Results: Roland Fryer’s experiment to incentivize students with monetary rewards did not lead to a measurable increase in student achievement in New York City or Chicago, where students were paid for good test scores or grades in core courses. More effective were two smaller experiments that rewarded students for so-called "inputs" (as opposed to "outcomes" like test scores). These "input" experiments, which compensated students for the number of books read and for attendance and homework, had statistically significant effects on student achievement.

What’s Interesting: The Dallas program—which paid second graders $3 for each book read—had the strongest effect on increasing reading comprehension and vocabulary and was the least expensive program. The effect of this intervention did not lead to negative results once the incentive was taken away—rather, there were continued, albeit smaller, gains in achievement after the program ended.

Caveat: White and Asian students did not experience gains, although the sample size was small, and students not eligible for free lunch improved more than students who were eligible. The gains measured once the program ended only looked at one year of data.

Study: Teacher Quality in Educational Production: Tracking, Decay, and Student Achievement Source: Jesse Rothstein, The Quarterly Journal of Economics

Results: This paper looks at some of the assumptions behind so-called “Value-Added” models of teacher quality ("VAMs") and finds that "the assumptions underlying common VAMs are substantially incorrect, at least in North Carolina". Classroom assignments are not sufficiently random to allow for a causal conclusion in common models.

What’s Interesting: Rothstein makes two main points: (1) Accountability policies that rely on measures of short-term value added would do a poor job of rewarding the teachers who are best for students’ longer-run outcomes; and (2) Models that rely on incorrect assumptions are likely to yield misleading estimates, and policies that use these estimates may reward and punish teachers for the students they are assigned as much as for their actual effectiveness in the classroom.

Caveat: The study only examined data from students in North Carolina.

Study: A Closer Look at Charter Schools and Segregation Source: Gary Ritter, Nathan Jensen, Brian Kisida, Joshua McGee, EducationNext

Results: A report issued by Education Next takes issue with the January 2010 report by the Civil Rights Project which found high levels of racial segregation in charter schools as compared to traditional public schools. The Education Next report argues that by comparing the demographic composition of all charter schools to that of all traditional public schools, the CRP’s report ignored significant differences in neighborhood demographics within school districts. In order to correct for this, Education Next compares the levels of segregation for students in charter schools to that of their TPS counterparts in the central city districts of the 8 largest metropolitan areas in the CRP report. The authors found that the segregation gap narrows from the 20% reported in the CRP report to 10%.

What’s Interesting: The authors demonstrate that the majority of students in central cities, in both charters and TPS, attend highly segregated minority schools.

Caveat: The authors caution that their methodology could be improved one step further by comparing the demographics of charter schools with the traditional public schools that charter students would have attended had they gone to their local zoned school.