October 01, 2010

Progress Report Redux, 2009-2010

Yesterday, the city released its annual Progress Reports. Mirroring the trend in test scores, charter schools fared worse on the Progress Reports than their public school counterparts, receiving a higher proportion of C and D grades and a lower average Progress Report score overall. A full breakdown of charter school performance compared to last year's Progress Reports can be found in the spreadsheet available here.

Some observations:

1. Like their public school counterparts, several charter schools benefited from the DOE's decision to implement a floor in the grading process. Under this floor, a school's overall Progress Report grade could not fall by more than two letter grades. That means the lowest possible grade for a school that earned an A last year (when 84% of all public schools and 73% of charter schools received the highest mark) was a C. The charter schools that benefited from this floor were (scores, Progress Report grades, and overall school ranks noted):

a. Future Leaders Institute Charter School: Score, 12.3; Grade: D (received B last year), Rank: 2%

b. The UFT Charter School: Score, 13; Grade: D (received B last year), Rank: 2%

c. Harlem Day Charter School: Score, 10.2; Grade: D (received B last year), Rank: 2%

d. Ross Global Academy Charter School: Score, 0.1; Grade: C (received A last year), Rank: 0%

e. Sisulu-Walker Charter School: Score, 11.2; Grade: C (received A last year), Rank: 1%

f. Bronx Charter School for Children: Score, 15.3; Grade: C (received A last year), Rank: 3%

g. Merrick Academy Charter School: Score, 13.9; Grade: C (received A last year), Rank: 3%
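For illustration, the floor rule can be expressed as a small function. This is my sketch of the rule as described above, not the DOE's actual implementation, which converts numeric scores to letter grades before applying the floor:

```python
# Letter grades from best to worst; a lower index means a better grade.
GRADES = ["A", "B", "C", "D", "F"]

def apply_floor(last_year, this_year):
    """Cap this year's drop at two letter grades below last year's grade."""
    floor = min(GRADES.index(last_year) + 2, len(GRADES) - 1)
    return GRADES[min(GRADES.index(this_year), floor)]

# An A school last year cannot fall below a C this year,
# and a B school cannot fall below a D:
print(apply_floor("A", "F"))  # C
print(apply_floor("B", "F"))  # D
```

This is exactly why the seven schools above, all A or B schools last year, bottomed out at C or D despite scores in the lowest few percentiles.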

2. Many of the charter schools that performed poorly in 2008-2009 again performed poorly in 2009-2010. The exception is the South Bronx Charter School for International Cultures and the Arts, which went from being in the 20th percentile to the 76th percentile – the biggest gain of any school. The school that dropped the most was New Heights Academy Charter School, which went from being in the 98th percentile of all schools to the 17th percentile.

3. Merrick Academy and Girls Prep, two schools plagued by staff and space problems, respectively, experienced large drops in their overall Progress Report scores and percentile rankings. Merrick Academy's overall score dropped by over 80 points, and its ranking fell from the 76th percentile of all schools to the bottom 3%. Girls Prep Charter School's score dropped by 70 points, and its rank fell from the 82nd percentile to the 13th.

4. The UFT Charter School remained in the bottom 5% of all schools citywide, becoming only the second charter school to remain in the bottom 15% of all schools citywide for three years in a row. The only other school to do so, Peninsula Preparatory Academy, improved this year and is now in the 35th percentile of all schools citywide.

5. The best-performing charter school was Democracy Prep Charter School, which received a score of 88.9 (compared to 99.8 last year) and was ranked in the top 1% of all schools citywide. Other charter schools in the top 5% citywide were Williamsburg Collegiate Charter School, KIPP Infinity Charter School, and Brooklyn Excelsior Charter School. A full list of all charter schools that received Progress Report grades, along with their scores, overall percentile ranks, and performance last year, is available below.

As always, I welcome your comments and suggestions for how to improve this report!

August 03, 2010

Charter School 2009-2010 Test Scores

As discussed here and here, the State released the 2009-2010 Grade 3 through 8 Math and ELA test results last week. The focus has been on the new, higher bar for passing the tests and the resulting large drop in the percentage of students judged proficient. Charter schools, like traditional public schools across the city, saw their much-touted proficiency gains plummet. Barbara Martinez at the Wall Street Journal did a good job of summarizing charter schools' results in New York City. To give a more complete picture, I analyzed the 2009-2010 results for charters to see which schools performed best and how they compared to their traditional public school counterparts. I also posted data on individual schools below and in this spreadsheet.

I defined proficiency in the customary way: as the proportion of students at a charter school scoring Level 3 or higher on the ELA or Math tests. To measure overall school performance, I averaged the proficiency rate across grade levels within each subject, and then averaged the ELA and Math figures to come up with a single "proficiency" number. The schools with the highest average proficiency rates were Harlem Success Academy, Icahn Charter School 2, the Bronx Charter School for Excellence, and the Williamsburg Collegiate Charter School. (The other two Icahn schools also scored in the top ten of all charter schools.) To be clear, different schools serve different grades, and comparing performance across grades can be misleading.
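That averaging can be sketched in a few lines (the per-grade rates here are made up for illustration):

```python
# Hypothetical per-grade proficiency rates (share of students scoring
# Level 3 or higher), broken down by subject.
ela_by_grade = {3: 0.55, 4: 0.60, 5: 0.50}
math_by_grade = {3: 0.70, 4: 0.75, 5: 0.65}

def subject_average(rates_by_grade):
    """Average the proficiency rate across the grades tested."""
    return sum(rates_by_grade.values()) / len(rates_by_grade)

# Single school-level "proficiency" number: the mean of the
# ELA and Math subject averages.
proficiency = (subject_average(ela_by_grade) + subject_average(math_by_grade)) / 2
print(round(proficiency, 3))  # 0.625
```

Note that this gives each subject equal weight regardless of how many grades were tested in it.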

I’ve posted a chart below that lists the average proficiency rates, as well as the ELA and Math proficiency rates, for every charter school that posted test results during the 2009-2010 school year. Scroll over the name of a school to find out which grades the school serves, which grades were tested, and other salient information about the school’s performance.

I also compared average charter school performance to average traditional public school performance in neighborhoods in which there is a large cluster of charter schools. This gives a rough sense of how charters compare to traditional public schools with somewhat similar demographics. The neighborhoods I chose were the same that the UFT looked at in their analysis of charter schools: the South Bronx, Central Brooklyn, and Harlem.

Using this simple metric, I found that charters in the South Bronx, Harlem, and Central Brooklyn performed much better than their traditional public school counterparts. Average charter proficiency in all three neighborhoods was about 50% compared to 35% in traditional public schools. Charters performed significantly better in math than traditional public schools, which mirrors the trend citywide. Charter schools located in the South Bronx in particular had a proficiency rate that was around 25 points higher than that of traditional public schools in the same neighborhood. The chart below summarizes my findings.
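The neighborhood comparison boils down to grouping schools by neighborhood and sector and averaging their proficiency rates. A minimal sketch, with made-up numbers:

```python
from collections import defaultdict

# Hypothetical per-school records: (neighborhood, sector, average proficiency).
records = [
    ("Harlem", "charter", 0.52), ("Harlem", "charter", 0.48),
    ("Harlem", "traditional", 0.36), ("Harlem", "traditional", 0.34),
]

# Group proficiency rates by (neighborhood, sector), then average each group.
groups = defaultdict(list)
for neighborhood, sector, rate in records:
    groups[(neighborhood, sector)].append(rate)

averages = {key: sum(rates) / len(rates) for key, rates in groups.items()}
print(round(averages[("Harlem", "charter")], 2))      # 0.5
print(round(averages[("Harlem", "traditional")], 2))  # 0.35
```

An unweighted average like this treats every school equally regardless of enrollment; weighting by the number of tested students would be a reasonable alternative.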

My analysis of charter school test performance did not take into account demographics, such as the proportion of ELL or special education students or the number of students who are eligible for free or reduced priced lunch. These factors, of course, can make a significant difference in test performance.

As always, I welcome your feedback for ways that I can improve this analysis, as well as other methods that I could use to make the data more understandable.

July 27, 2010

Gifted and Talented, Especially in District 2

Using Gifted and Talented test data posted by InsideSchools, I created a map that shows the concentrations of aspiring G&T kindergartners in New York City. The number next to each dot represents the district in which the student lives. Scroll over the dots to get the percentage breakdown of students who qualified for district G&T programs, who qualified for citywide G&T programs, and who scored in the 99th percentile on the test. The gray layers under the map correspond to the median household income of those zip codes. No surprise here that the largest number of G&T test takers resided in the wealthiest districts, but Districts 15 and 30 also seemed to do well. As always, I would love your feedback on how to improve this map.

July 20, 2010

Follow the Money: UFT’s Political Fundraising Highest in Ten Years

In a recent article in the journal EducationNext, Mike Antonucci reviewed the finances of the two largest teachers unions, the National Education Association (NEA) and the American Federation of Teachers (AFT). He found that teachers unions in states like Oregon, Colorado, and Montana spent several hundred dollars per teacher on political campaign spending for candidates and ballot initiatives. New York, according to Antonucci, spent only $5 per teacher.

But this is only part of the picture. Another source of political spending can be found in financial documents that the United Federation of Teachers (UFT) filed with the federal government. According to this "LM-2" filing, the UFT spent around $31 per teacher, or a little over $2.4 million overall, out of a $202 million budget, on political activities during the 2008-2009 school year.* The UFT membership, however, consists of more than just teachers. If you include the total UFT membership of 164,462, spending on political activities would be around $16 per member. (To be clear, Antonucci only considered active teachers in his calculations.)

In addition to this spending, which includes things like lobbying, buses to events, and phone banks, the UFT has a political action committee (PAC). The PAC is a stand-alone group whose specific purpose is to dole out money to politicians, groups, and ballot measures that the union supports. The UFT's PAC, known as the Committee on Political Education ("COPE"), is funded by voluntary member contributions as well as other sources.

COPE spent $187,411 in 2008-2009 on donations to politicians. The fund’s balance—that is, the amount that can theoretically be given away—has also dramatically increased, to $1.35 million in July 2009 from an average of $124,000 during 2000-2005. Furthermore, contributions to COPE—that is, the amount that members choose to contribute to the union’s political activities—have reached their highest level in ten years. In contrast, the amount the UFT spent on political activities independent of COPE has remained relatively constant at around $2.5 million annually.

Much of the increase in political giving could be due to the pressure that teachers unions around the country have faced. According to an article in the Wall Street Journal today, charter supporters outspent the UFT by about $100,000 during this same period. Last year's spending could have been particularly high due to the mayoral race, although the increase in COPE's political spending pre-dated the race by several years.

Although dues for UFT members vary by job type, teachers are required to contribute $47.27 of their paycheck semi-monthly to the UFT. This means that around 3% of a typical teacher's UFT dues is spent on political activity. UFT members can also elect to send part of their paycheck to COPE. According to the UFT handbook, this voluntary contribution is usually around $5 per paycheck.

The UFT’s LM-2 also lists the amount of time that UFT employees spend on various activities. My analysis found that 61 of the 623 paid UFT employees, or around 10%, spent more than a quarter of their time on political activities. Overall, around 7% of all UFT employee time is devoted to political lobbying.

The majority of the UFT’s funds were spent on benefits for members, consultants, and lawyers. However, the UFT, like the AFT and the NEA, also spends a significant amount of its funds supporting liberal causes. The biggest donations were to groups like ACORN and the National Action Network. The UFT also contributed small sums to a wide number of community organizations and to a number of religious, political and ethnic organizations like the American Friends of the Yitzhak Rabin Center, Empire State Pride Agenda, Inc. and the Hispanic Federation. (A full breakdown of the UFT’s contributions is available below and in this spreadsheet.)

Salary-wise, 85 of the 624 UFT employees, or 13%, made over $100,000, with the highest salary paid in 2008-2009 being $228,705. The average UFT salary was $51,215; however, since many employees work part-time, that number may be somewhat distorted.

As always, I welcome your feedback and questions and encourage any UFT members to share their understanding of the UFT’s finances in the comments.

*I arrived at the $31/teacher figure by using numbers provided to me by the DOE and contained within the UFT filing: According to the UFT’s filing, the union spent $2,404,820 on political activities during the 2009 fiscal year. Documents provided to me by the DOE stated that the active teaching force as of January 2009 was 78,728 teachers.
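The footnote's arithmetic can be checked directly:

```python
# Figures cited in the footnote: the UFT's LM-2 filing and DOE documents.
political_spending = 2404820  # spending on political activities, FY 2009
active_teachers = 78728       # active teaching force, January 2009

per_teacher = political_spending / active_teachers
print(round(per_teacher, 2))  # 30.55, i.e. "around $31 per teacher"
```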

July 08, 2010

Discipline Data: Suspensions at Charter Schools & Traditional Public Schools

On Tuesday, the Daily News published a report on the rising rate of student suspensions in New York City’s schools. Since charter schools in New York often have discipline policies that differ from those of their traditional public school counterparts, I was curious to compare suspension rates in charters to those in traditional public schools. Looking at the Basic Education Data System (BEDS) filings for both charter schools and traditional public schools during the 2008-2009 school year, I found that both types of schools suspended, on average, around 8% of their student body. (BEDS asks schools to report only the number of students who were suspended, not the total number of suspensions, which is the figure the Daily News article cited.)

Since school demographics can be correlated with suspension rates, I compared charter school suspension rates to those of their traditional public school counterparts by neighborhood. The results varied: in Harlem and the South Bronx, charter schools suspended a lower percentage of their student body, while in Central Brooklyn, charter schools suspended slightly more students. A breakdown of suspension rates at co-located charter schools is available in this spreadsheet.

Overall, suspension rates among charter schools varied widely, with some suspending no students and others suspending close to 40% of their student body. Out of the 77 schools that were open in 2008-2009, 18 suspended no students and 21 suspended 10% or more of their student body. However, as with all BEDS data, these numbers are self-reported by schools and thus could be unreliable. (The same is true of the traditional public school data, although that data also includes the more severe superintendent's suspensions, which are corroborated by a second person outside the school itself.)
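The metric itself is simply students suspended at least once divided by enrollment. A sketch with hypothetical BEDS-style counts:

```python
# Hypothetical counts per school: (students suspended at least once, enrollment).
schools = {
    "Charter A": (0, 250),   # suspended no students
    "Charter B": (98, 250),  # close to 40% of its student body
    "Charter C": (20, 250),
}

# Suspension rate = students suspended at least once / total enrollment.
for name, (suspended, enrolled) in schools.items():
    print(f"{name}: {suspended / enrolled:.0%}")
```

Note that because BEDS counts students rather than incidents, a student suspended three times adds the same amount to this rate as a student suspended once.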

I did not disaggregate the data by type of school—that is, elementary, middle, or high school—since I didn’t see any noticeable difference in suspension rates between middle and high schools. (Elementary schools had slightly lower rates.) Furthermore, this data only looks at suspensions, not expulsions—a key difference, since expelling a student might be easier at a charter school, where the board only has to approve a principal’s recommendation. At a traditional public school, a student cannot be expelled unless he/she is 17 years of age. If the student is younger than 17, the most severe form of discipline allowed is an extended suspension for a year or an involuntary transfer, both of which can only be given with the consent of the regional superintendent and/or the Director of Suspensions.

For more on traditional public school discipline policies, see this primer. For a sample of charter school discipline policies, see this folder. (N.B. I plan on updating this folder after I look at more charter applications next week.)

As always, I welcome your feedback on ways to improve this data, as well as other questions you might have.

June 30, 2010

MAP UPDATE: Closed Schools v. Charter Schools Since 2003

Last week I posted a map of the schools that have been phased out by the DOE since 2003. (Note: not all of these schools were initially proposed for closure under Mayor Bloomberg.) After your feedback, I added some new information: the charter schools that have been opened since 2003 (in blue), new schools that have been opened since 2003 (in green), and economic data about the neighborhoods in which these schools are located. As always, I'd love your feedback as I move forward with this project!

CMOs Need Philanthropy, TFA, According to Report

The National Study on Charter Management Organization (CMO) Effectiveness: Report on Interim Findings

A recent national study of Charter Management Organizations, or CMOs, by the non-partisan Mathematica Policy Research sheds some light on the role these organizations play in the national educational landscape. By my own count, CMOs ran 37 of the 77 charter schools in New York City during the 2008-2009 school year, and they have plans to open dozens more in the next decade. While CMOs attract large amounts of philanthropic support, critics charge that they are opaque and run their schools more like for-profit institutions. This interim report offers fodder for both supporters and detractors. I found five points to be particularly interesting:

  • CMOs need philanthropy to exist: All 44 CMOs in the study relied on philanthropic dollars to support operations. The average CMO relied on philanthropy for 13% of total operating revenues. CMOs funded by NewSchools Venture Fund report that 64 percent of their central office revenues come from philanthropy. The report concludes: “At least for now, these CMOs are unable to support their central offices (which often comprise 20 percent or more of total CMO spending) and facilities costs on per pupil revenues alone.”

  • CMOs rely on alternate certification programs, like TFA, for talent: According to the report, almost 20% of teachers at CMO schools come from alternative certification programs like TFA. In addition, many of the people in managerial and leadership positions are TFA alumni. CMOs claim that teachers trained in the TFA mode are accustomed to longer hours and “No Excuses” approaches and therefore require less training in the culture of the CMO. The authors question the ability of CMOs to expand if they rely so heavily on one source of talent.

  • CMOs have had problems expanding to high schools: Across the country, CMOs operate a disproportionate number of elementary and middle schools. CMO leaders say that expanding to high schools has proven difficult, both because the students arrive with an array of problems that the schools are ill-equipped to deal with and because "a highly prescriptive education model that works for middle school students may become a liability in high school... [S]tudents accustomed to highly structured courses can become too dependent on their instructors. If that happens, these students are unlikely to acquire the skills needed to navigate the more independent educational environment they will encounter in college.”

  • CMO growth alone will not be able to improve entire school districts: Although CMOs have expanded rapidly, the recent pace of new CMO creation has slowed dramatically. The report notes that Arne Duncan’s goal of turning around the lowest-performing 5,000 schools by 2014 can’t be reached by the current CMO crop alone, since these CMOs plan on opening only 336 new schools in that timeframe. Furthermore, the report points out that expanding often puts CMOs on shaky financial terrain: “Expansion may not equal sustainability: According to our survey, CMOs with two to six schools draw an average of 9.6% of their operating budgets from private funds. That proportion increases to an average of 14% for CMOs with seven to ten schools, and to 16.3% for CMOs with more than ten schools.”

  • The majority of CMOs are clustered in five states and a handful of cities in those states: More than 80% of CMOs are clustered in a handful of places: California, Texas, Arizona, Ohio, Illinois, New York, and the District of Columbia. They make up a large share of big charter markets but are relatively sparse in smaller ones. Furthermore, CMOs tend to open schools in regional markets—that is, very few CMOs have national ambitions.

List of NYC CMOs Counted in the Study:
  • Achievement First
  • Beginning with Children Foundation – not included in the survey because it had fewer than three charters open in 2007.
  • Green Dot
  • Harlem Children’s Zone – not included in the survey because it had fewer than three charters open in 2007.
  • Harlem Village Academies – not included in the survey because it had fewer than three charters open in 2007.
  • Lighthouse Academies Inc
  • St. Hope
  • Uncommon Schools

Not Included in Study but Included in My List: (Most were not included in the Mathematica study because they had only one school open in 2008-2009)

  • Ascend Learning Inc
  • Believe High School Network
  • Boys and Girls Harbor Inc
  • Explore Schools Inc
  • Hyde Foundation
  • Icahn Associates Corporation
  • Public Prep
  • Ross Institute
  • Success Charter Network

June 22, 2010

KIPP Study Shows High Rate of Retention

A report released today by Mathematica, a non-partisan policy research center, examined the impact of 22 KIPP charter schools on student achievement. The report found that KIPP students performed better than their traditional public school peers, with gains large enough to halve the black-white achievement gap.

Another interesting finding is that KIPP schools retained students—that is, made them repeat a grade—more frequently than their traditional public school counterparts. The authors write:

 “KIPP’s commitment to high expectations of students does not encourage social promotion. KIPP expects students to meet their standards for being academically prepared for the next grade before they will be promoted. Consequently, KIPP middle schools retain students at significantly higher rates than other public schools in the same districts.”

Indeed, while the researchers found the evidence inconclusive with regards to relative attrition rates and relative achievement levels of incoming students, they found strong evidence that KIPP holds back 5th and 6th grade students at a rate far higher than traditional public schools.

This result augments my earlier report, which found that the majority of cohort “attrition” detailed in the UFT report on “Vanishing Students” was actually due to retention at charter schools. Let's hope there is further research into the impact of retention on achievement at charter schools.

June 18, 2010

MAP ALERT: School Closures, 2003 - Present

As summer approaches, I’ve decided to tackle some big projects, one of which is to look at the effects that school closures have had on the remaining schools in the surrounding area. To get started, I’ve created a map that plots all 111 schools that Chancellor Joel Klein has closed since 2002, including the 19 schools whose fate is still up in the air. Take a look and let me know how you think it can be improved. I’d also love to hear your thoughts on how best to approach this issue!

Buried in the New School Report, the Start of an Answer

In its focus on school leadership, the New School's recent report, Managing by the Numbers, provides a partial answer to a question I’ve long harbored: what happens to incompetent principals?

Teachers accused of incompetence or misconduct have been sent to notorious “rubber rooms,” where they await a hearing known colloquially by its section number in the Education Law, 3020-a. While waiting, teachers are paid their full salary until the dispute is resolved. (For more on teacher termination, these articles are particularly detailed.)

Yet for principals, the process is less commonly discussed. Are there principal rubber rooms? Who is in charge of documenting principal incompetence? And just how hard is it to terminate a principal?

Managing by the Numbers offers some answers. Each principal signs a contract with the DOE agreeing to be held accountable for academic results. Principals are then evaluated through a principal performance review, which weighs student progress (in the form of the much-maligned Progress Reports), the student population served, the principal’s compliance with budgeting and school services requirements, and the principal’s ability to articulate and implement five goals for the school. There is a fifth component, an outside quality review, that is optional for schools that do well on their Progress Report. The largest parts of the evaluation are the Progress Report and the self-assigned goals, which basically means that if a principal does well on the Progress Report, there isn’t much incentive to dig deeper.

If a principal receives a low Progress Report grade, he or she can be transferred to another school, moved into an administrative position, or, if the conduct was particularly bad, terminated. The termination procedure for a principal is similar to that for a teacher in that it includes a 3020-a hearing, but the principal is suspended without pay until the hearing is resolved.

Under this new accountability framework, there has been enormous turnover in the principal ranks: over 80% of the current crop of principals started with the DOE after Klein took office. More recently, though, principal turnover has declined, with annual retirements decreasing from 11.8% in 2002-2003 to 3.8% in 2007-2008.

To be clear, there is a distinction between principal removal and principal termination. The report suggests that principals whom the DOE wishes to remove from schools are simply moved into DOE staff positions: not fired, and not sent to rubber rooms. Anita Gomez-Palacia, executive director of operations for the principals' union, is quoted as saying: “It’s very hard to prove incompetence based on test scores in the school building. It’s hard to prove the administrator is the cause of the problem.” In other words, it’s easy to get a principal out of a school, but there is less incentive or drive to get a principal off the DOE’s payroll.

This information is helpful, but it still leaves me with more questions. Just how many principals get removed from their schools? Is it possible to remove a principal whom students and teachers dislike but who has achieved good test results? How many principals have actually been terminated? What is the hearing process like, and how does it differ from a teacher’s 3020-a hearing? I would love feedback from principals and teachers to get a better sense of how principal accountability differs from teacher accountability.

