Eva’s Offensive

After many months of intense scrutiny and criticism, Dr. Eva Moskowitz, founder and CEO of the Success Academy charter school network, has gone on the offensive. In this effort she has the help of an expensive PR firm, her traditional ally the Wall Street Journal, the Harvard Club of New York, and–surprisingly–WNYC reporter Beth Fertig.

The recent criticism began last October, when the PBS NewsHour exposed her schools’ practice of repeatedly suspending 5-, 6-, and 7-year-olds out of school. (That was my last piece for the NewsHour before I retired.) Later in October, Kate Taylor of the New York Times revealed that one of her schools had a ‘got to go’ list of students to be dropped; Moskowitz did not fire the principal. In an electrifying report in February, Taylor wrote about a video of a Success Academy teacher humiliating a child.

Dr. Moskowitz has retained Mercury LLC, the same PR firm that is advising Michigan’s embattled governor, Rick Snyder. She emailed her staff accusing the New York Times of a ‘vendetta’ against her. On Monday, March 14, the Wall Street Journal published her op-ed, “Orderliness in School: What a Concept.” “Over the past year the Times’s principal education reporter has devoted 34% of the total word count for her education stories, including four of her seven longest articles, to unrelentingly negative coverage of Success,” Moskowitz wrote.

But her main point was that she and Success Academies represent the last line of defense against violent and disruptive behavior in our schools.  Did the PR firm suggest she tar her critics with the old reliable “commie-pinko” brush?   (Making it parenthetical was a nice touch.)

The unstated premise is that parents are susceptible to being duped because they are poor and unsophisticated. (Once upon a time, this view was known as “false consciousness”—the Marxist critique of how the proletariat could be misled by capitalist society.)

The Harvard Club of New York is, perhaps inadvertently, also helping Moskowitz.  It has scheduled an evening presentation on Monday, March 29th.  The blurb describing the event makes no mention of any criticism.  Here’s a sample:

Eva Moskowitz founded Success Academy Charter Schools in 2006 with the dual mission of building world-class schools for New York City children and serving as a catalyst and a national model for education reform to help change public policies that prevent so many children from having access to opportunity. Firmly believing that inner-city students deserve the same high-quality education as their more affluent peers, and convinced that all children, regardless of zip code or socioeconomic background, can achieve at the highest levels, she opened the first Success Academy in Harlem and today operates 34 schools in some of the city’s most disadvantaged neighborhoods. Success Academy continues to grow at a rapid pace and will be hiring more than 900 teachers and other personnel before the next academic year.

The event is open to Club members and their guests.  (I cannot attend because I will be out of the country.)

Moskowitz’s most surprising ally in her PR offensive is Beth Fertig of WNYC public radio here in New York. She and her colleague Jenny Ye reported on March 15 that ‘NYC Charters Retain Students Better Than Traditional Schools.’ The lead sentence: “Citywide, across all grades, 10.6 percent of charter school students transferred out in 2013-14, compared to 13 percent of traditional public school students.” They cite the KIPP charter network as having an especially low attrition rate, about 25% of the rate in neighboring traditional schools.

This is like comparing the kids who go to the playground to toss a ball around with the kids whose parents enroll them in the karate program at the Y, buy them uniforms and accompany them to practice and competitions.

Of course the departure rate from traditional urban public schools is higher. Families lose their homes and have to move. Parents change jobs and have to move. A single parent gets sick and has to move in with relatives. And generally the kids then move to the closest school. In other words, they ‘transfer.’

On the other hand, getting into a charter school entails jumping through hoops, often a lot of them, and those parents–who have sought out what they hope are better opportunities for their children–are not going to change schools just because of a job loss or an illness. Some charter school students may ‘transfer’ because their school doesn’t provide the special education services it is supposed to. Some may ‘transfer’ after being sent home multiple times for minor offenses. That seems to happen quite often at Success Academies, which has a laundry list of offenses that warrant out-of-school suspensions.

Therefore, comparing transfer rates is meaningless, a waste of the reporter’s time and public radio’s resources. Every well-run charter school should have attrition rates as low as KIPP’s, or lower.

This silly and meaningless exercise in comparing unconnected numbers makes Success Academies look good. Although SA had the second-worst attrition rate of the charter networks compared (57.4% of the traditional-school rate), that’s not how WNYC presented it. Here’s what they wrote:

We found most of Success’s 18 schools in the 2013-14 school year had attrition rates that were lower than those of their local districts. The two schools that were slightly higher are in Bedford Stuyvesant and Cobble Hill (where the first grade teacher was caught on camera). Bed Stuy 2’s attrition rate was 13.4 percent versus 12.4 percent for traditional public schools in District 14. The Cobble Hill school’s attrition rate was 12.5 percent versus 10.8 percent in the regular District 15 schools.

“Our low attrition rates reflect what parents appreciate about our schools,” said Success founder Moskowitz. “That our classrooms are as joyful as they are rigorous.”

Allowing Success Academies to boast of a supposedly low attrition rate is beyond ironic, because the organization plays fast and loose with numbers.  The single most accurate way to calculate attrition is to list everyone who starts school on Day One and then count again 365 days later on the following Day One to determine who has left.  Then the school could figure out why students left and, where appropriate, make adjustments.  This is what KIPP does.  The resulting number is not necessarily flattering because it includes everyone who left: some families move out of town, some kids decide they don’t want to work that hard, other kids just want to be with their peers, and so on.  

A second way to count attrition is from September 1 to June 25th, or whenever school lets out. That leaves out summer loss, which is actually pretty significant. Other charter networks I am aware of count attrition this way.

The third way produces the lowest and most impressive attrition number: counting from the official NYC attendance-count day, known as BEDS, which occurs fairly early in October. So a school can count the number of students on October 10th and count again on June 25th. Doing it this way means that whatever happens between the true opening of school (late August or early September) and BEDS day does not show up on any records. So if a charter network employs multiple out-of-school suspensions during that six-week period, August 25 to October 10, no one outside the network would EVER know.

Approach #3 is the one taken by Success Academies, which is why Moskowitz boasts of low attrition.
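
To make the difference concrete, here is a small sketch of how the three counting windows play out (my own illustration in Python, with an invented four-student roster, not any network’s actual records):

```python
from datetime import date

# Invented roster: (student, date enrolled, date left; None = still enrolled)
roster = [
    ("A", date(2013, 8, 26), date(2013, 9, 20)),  # gone before BEDS day
    ("B", date(2013, 8, 26), date(2014, 3, 1)),   # left mid-year
    ("C", date(2013, 8, 26), date(2014, 7, 15)),  # left over the summer
    ("D", date(2013, 8, 26), None),               # stayed
]

def attrition(start, end):
    """Share of the students enrolled on `start` who are gone by `end`."""
    cohort = [s for s in roster if s[1] <= start and (s[2] is None or s[2] > start)]
    gone = [s for s in cohort if s[2] is not None and s[2] <= end]
    return len(gone) / len(cohort)

print(attrition(date(2013, 8, 26), date(2014, 8, 25)))   # 1: Day One to Day One -> 0.75
print(attrition(date(2013, 9, 1), date(2014, 6, 25)))    # 2: Sept. 1 to June 25 -> 0.50
print(attrition(date(2013, 10, 10), date(2014, 6, 25)))  # 3: BEDS day to June 25 -> 0.33
```

Same roster, three different stories; the student pushed out in September simply vanishes from the third calculation.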

The Eva Moskowitz mess is emblematic of a larger problem in the charter world. Greg Richmond is the President and CEO of the National Association of Charter School Authorizers. I suggest everyone read his recent speech about the state of the charter movement; three paragraphs seem particularly relevant to this discussion of Success Academies.

We have created schools that will not enroll students in upper grade levels. We have some schools that believe it is appropriate to counsel children out mid-year. Some charters believe it is appropriate to tell families of students with disabilities that their charter school cannot serve them.

In short, charters have relied on the district schools to be a safety net for students not served by charter schools. That’s not right. If we believe that charter schools can provide a better education for children, we need to include all children.

Charter schools have also chosen to fight against school districts even when it was in the public interest to work together.

Eva Moskowitz is fighting hard to maintain her position as the face and voice of the charter school movement here in New York City–and perhaps beyond. Leaders of other charter networks here have told me privately that they wish this weren’t so, but so far they have not been willing to stand up and be counted.


38 thoughts on “Eva’s Offensive”

  1. I have just deleted two paragraphs about WNYC’s methodology, in deference to the explanation provided by the research team. In its analysis, WNYC did not factor in students who dropped out over the summer, which KIPP told me is important information. WNYC explained: “We counted attrition as the number of students discharged from every school from July 2013 through June 2014, excluding students who enrolled for the fall and left over the summer or during the first week of September.”

    WNYC says it began counting after the first week of September, but those numbers are not publicly available, not even to the City or the State. Remember, counting begins on BEDS day. WNYC continued: “We compared this to the sum of the students enrolled as of October 31, 2013 and the number of students who transferred before November (our adjusted enrollment number).”

    Rather than get in a back-and-forth about this, I am removing the paragraphs. My point is that this research is misleading and meaningless. It was a waste of resources (and I contribute to WNYC, by the way). Worse yet, it takes the public’s eye off the ball.

    To repeat myself, shouting this time: A TRANSFER RATE COMPARISON BETWEEN TRADITIONAL AND CHARTER SCHOOLS IS MEANINGLESS….


    • Thank you for reporting on this.

      This is the second time that Beth Fertig has used exactly the same methods that Success Academy prefers to report on attrition rates of charter schools. Instead of compiling longitudinal data like that in the July 2015 NYC Independent Budget Office Report, which shows that 53 charter schools (4 of which are Success Academy schools) lost 49.5% of their entering Kindergarten class by 5th grade, she chose to use only a single year’s snapshot (really only 9 months). And even then, as you noted above, she made sure that any student who entered a charter school in August and was drummed out in the first few weeks was somehow thrown out of the data. Why? Did WNYC provide an answer to why they chose the Success Academy model of calculating attrition? It’s a very odd approach for a journalist to exclude so many students from the data. But perhaps not surprising.

      Beth Fertig’s so-called “attrition study” 4 years ago has often been cited by Success Academy and their supporters as the definitive “proof” that they welcome all children and try their best to keep them. Now that the public has incontrovertible evidence that the principals and teachers most honored by the Success Academy network as “models” use methods like “got to go” lists and humiliating and punishing first graders who don’t learn quickly enough, you would think that Ms. Fertig would have a typical reporter’s curiosity to look closely at why her previous “study” didn’t reveal how these now-acknowledged tactics keep attrition rates high. Instead, she doubled down, using exactly the limited 9-month data that Success Academy wants reporters to use. I thought that the video evidence of the tactics that one of Success Academy’s model teachers uses might prompt Ms. Fertig to see if the 3 Success Academy charter schools in the July 2015 IBO longitudinal analysis of charter school attrition rates lost even more of their starting Kindergarten cohort than the 49.5% “average” of the charter schools. But I was wrong.

      When truly curious journalists examine the attrition rates, you get studies like the one that The Guardian published a few weeks ago.

      http://www.theguardian.com/us-news/2016/feb/21/success-academy-charter-school-students-got-to-go

      By limiting herself to a single year’s data, as Success Academy prefers, Ms. Fertig offers attrition rates that always combine the low attrition rates of the upper (testing) grades at Success Academy elementary schools – whose ranks have already been culled of low-performing students – with the pre-testing years, when students are suspended at inordinately high rates and humiliated until they leave. If far fewer of the older students leave — because those grades have already been culled of their “got to go” students and ONLY new students who are pre-tested are added to the mix — averaging in their lower attrition rate hides whether an unusually high number of students who win that original Kindergarten lottery are mysteriously disappearing before 3rd grade testing begins.

      If Ms. Fertig had bothered to look closely at the data at individual schools like Success Academy Bed Stuy 1, she could have presented a more honest snapshot about attrition rates and how “got to go” lists work.

      According to data at data.nysed.gov, Bed Stuy 1 had 103 2nd graders as of BEDS day (October) in the 2013-2014 school year. 68 were economically disadvantaged, or 66%, similar to, though slightly lower than, the District 14 average. There were 63 boys and 40 girls in that 2nd grade class, and 19 students with disabilities.

      But third grade means state tests, and what happened to that 2nd grade cohort in 3rd grade? By the next October (2014-15 school year) it was down to 92 students in total, and only 50 were economically disadvantaged. There were 2 fewer girls and 9 fewer boys. The number of economically disadvantaged students shrank by 18 – from 68 to 50 – so either some of the poorer students were replaced by more affluent ones, or an unusually large number of families left free-lunch status but remained in the school. As a result, instead of comprising 66% of the cohort, as they had at the beginning of 2nd grade, by the beginning of 3rd grade only 54% of the class was economically disadvantaged. The number of students with disabilities shrank by nearly half: only 10 of those 19 were left by the beginning of 3rd grade.

      You would think losing that many low-income and disabled students from the cohort would satisfy Success Academy, but it gets even worse when you look at which of those remaining 92 3rd graders in October stayed around to take the spring state tests. Only 76 students out of the original 103 2nd graders took the 3rd grade exam the next year. So between October and testing in the spring ANOTHER 16 students – about 17% of the 3rd graders – disappeared. Only 43 of those 63 2nd grade boys were around to take the test in 3rd grade – nearly 1/3 of them MIA. The number of economically disadvantaged children in that cohort taking the 3rd grade exams was down to 41. So there were 68 low-income students at the beginning of 2nd grade and only 41 low-income students taking the state test the next year. There were 19 students with disabilities at the beginning of 2nd grade and only 6 (!) were around to take the 3rd grade tests.

      Nearly 40% of the low-income students disappeared between 2nd grade and the 3rd grade testing cohort. 13 out of 19 (68%) of the 2nd graders with disabilities disappeared from the testing cohort. And that is not a red flag? Apparently not to Beth Fertig.
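
      The percentages above can be checked directly from the quoted counts (a minimal sketch; these are only the numbers cited from data.nysed.gov, nothing more):

      ```python
      # Counts quoted above from data.nysed.gov for SA Bed Stuy 1
      grade2 = {"all": 103, "low_income": 68, "boys": 63, "swd": 19}  # Oct. 2013 BEDS day
      grade3 = {"all": 92, "low_income": 50, "swd": 10}               # Oct. 2014 BEDS day
      tested = {"all": 76, "low_income": 41, "boys": 43, "swd": 6}    # spring 3rd grade exams

      print(grade2["low_income"] / grade2["all"])             # 0.66 -> 66% low-income in 2nd grade
      print(grade3["low_income"] / grade3["all"])             # 0.54 -> 54% by the start of 3rd grade
      print(grade3["all"] - tested["all"])                    # 16 gone between October and testing
      print(1 - tested["low_income"] / grade2["low_income"])  # ~0.40 of low-income students gone
      print(1 - tested["swd"] / grade2["swd"])                # ~0.68 of students with disabilities gone
      ```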

      It’s rather stunning that nearly 40% of the low-income 2nd graders and 68% of the students with disabilities who were on the 2nd grade rolls were not part of the 3rd grade testing cohort at Success Academy Bed Stuy 1 this past year. However, the “good” news is that 100% of those remaining 3rd graders passed the state math exams. And if that is the only number that matters, and the number of expendable kids it took to get there is unimportant to Beth Fertig, then I can understand why she chooses to do this very limited study designed to provide the perfect cover for their actions.

      The most shocking news in Ms. Fertig’s WNYC report is that the attrition rate at this highest-performing charter chain is 2 or 3 times as high as the attrition rate at charter chains like KIPP, which have mediocre test results. Parents choose to leave a high-performing charter chain far more frequently than they choose to leave a much lower-performing charter chain, and it doesn’t warrant a closer look by a reporter covering the story? Because she believes that the “newsworthy” part of her study is that a wealthy, high-performing charter chain supposedly keeps more students than an underfunded failing public school with little to offer students? No wonder Beth Fertig is Success Academy’s go-to “expert” on charter school attrition. By defining “attrition” in exactly the way that hides what is being done to far too many unwanted at-risk students who win the Success Academy lottery, Ms. Fertig might as well be on the Success Academy payroll.


  2. Nice comments, John, although I think Moskowitz was right to point out that the Times leaned too heavily on the dubious notion that poor urban parents are too dumb or distracted to judge intelligently the quality of the education their children are getting in schools like the Success Academies.


    • The NY Times did not “lean heavily on the dubious notion that poor urban parents are too dumb or distracted to judge intelligently the quality of the education their children are getting in schools like the Success Academies.” Please cite evidence for this if you are going to make such a claim.

      What the NY Times showed is that if you are an at-risk child whose low-income single mom is living in a homeless shelter, and you don’t make the grade in your academic work, you will be punished and humiliated for being “bad”, where “bad” means the teacher believes the child is purposely not getting the right answer fast enough and is “ruining it for the other children”. The other parents who spoke up to defend Success Academy seemed to all be college-educated middle-class parents. Their children are treated fine because they already work at or above grade level. If the child from the homeless shelter were only a more superb student, she too would be fawned over, but she wasn’t, so she got the famous “humiliate” and “punish” treatment that the model teachers are rewarded for. That’s what the video made very clear. And when the model teacher makes sure to humiliate those poor performers in front of their peers, it encourages the peers to treat the child as if she is worthless as well. Whether or not that method works to teach the child is beside the point, as long as it works to get rid of her before 3rd grade. And it did! If it had not, she would very likely have been held back a second year and a third year until her mother got the message that she wasn’t wanted.

      Remember – Eva Moskowitz herself justifies high suspension rates by saying those Success Academy children act out in violent ways. As Mr. Merrow’s October report showed, a typical Success Academy school may suspend 32 of 132 Kindergarten and first graders at least 3 times each. That’s 24% of those 5- and 6-year-olds acting violently. Now that we see how “model” teachers treat the struggling children, it is clear why such an inordinately high number of Kindergarten and first grade students would be acting violently in Success Academy schools.

      What the NY Times and the middle-class Success Academy parents’ response did show is that some parents — especially the middle-class ones — don’t really care how other children are treated as long as their own child gets the luxuries offered by the charter and is fawned over. You hear less defense of these actions from the low-income Success Academy parents, whose higher-achieving kids are allowed to stay only as long as they keep performing. They often don’t have good public schools to fall back on, but I guarantee you that most of them aren’t publicly defending the actions of that teacher the way all the affluent college-educated parents in Cobble Hill were.

      Do charter schools that suspend their very youngest students the most have higher attrition rates than the ones that rarely suspend their students? From the data that Beth Fertig collected, where the highest-suspending charter school — Success Academy — also seems to have an attrition rate double that of most charters, the answer seems to be a resounding yes. It’s a shame that Beth Fertig is so focused on pretending that the high attrition rates are perfectly reasonable that she missed the bigger issue. It’s hard for me to believe a better reporter would have missed the biggest news in the data. I hope a real journalist follows up.


  3. Good piece. I was thinking the same thing about not including students who leave over the summer; it’s ridiculous. Have you read this Shanker Blog piece that uses a better way of counting students at Success?

    http://www.shankerinstitute.org/blog/student-attrition-and-backfilling-success-academy-charter-schools-what-student-enrollment

    Also, what is her explanation for not backfilling? It’s clear this provides her schools with an advantage.


  4. I wish you would look into Denver and its “high performing” charter network, the Denver School of Science and Technology (DSST). The graduation rate from its longest-established high school is 56.5%, based on the number of freshmen starting and the number of students graduating (the same methodology used by the Center on Reinventing Public Education, which ranked Denver Public Schools 45th out of 50 urban districts in graduation percentage). I have asked for data on where students leaving DSST go and why students are leaving this school. DSST says it doesn’t keep that information. The first District response indicated it, too, doesn’t keep that complete information. I await further information.

    Meanwhile Denver will soon have 11 DSST high schools, graduating about 60% of their incoming freshmen. Hmmmm. Where do those other 40% go?

    Jeannie Kaplan
    Former member, Denver Public Schools Board of Education


    • Jeannie, here is another perspective on Denver data:
      http://co.chalkbeat.org/2013/06/10/denver-charters-retain-more-students-keep-them-longer/#.Vu1xwPkrKM8
      “Of the four charters, DSST-Stapleton reported the lowest four-year student attrition rates. Of the 131 students who began their studies at the school in 2009, 112 – or 86 percent – still attended the school four years later.”
      Perhaps you are referring to 4-year graduation rates while a significant number of DSST students graduate in five?


      • I would like to see where the 112 number comes from. According to all of my data – which come from DSST, DPS, or the CO Dept of Ed – starting freshmen at DSST Stapleton, from 2005 on: 972. Graduated from 2009-2015: 549 = 56.5%. DSST at Green Valley Ranch had its first graduation in 2015. Its freshman class started with 145; 86 graduated. 59.3% graduation rate.

        I do my own data analyses because… Chalkbeat often uses data I find unreliable. Or spun.

        Jeannie



    • Thank you for bringing DPS’s “we actually only serve our top 60-80% of students” to the table. WHERE ARE THE OTHER 20-40% — and why are there no records to follow through with those students who are so viciously pushed out of our district’s “public” education system?


    • No matter how many “fact-finders” out there want to argue against what Jeannie Kaplan is saying about a WE ONLY SERVE OUR TOP 60-80% OF STUDENTS Denver district, I have personal experience with too many students kicked out of our district’s “exclusive” schools — and then pushed into a statistical oblivion. Their stories are not made up. Their stories are painful, and very, very real.


  5. Jeannie, does the figure you provide for “starting freshmen” perhaps include quite a few freshmen who repeated freshman year? If not, how did you derive it?

    BTW, line 546 of the 2005 data that I find here:

    http://www.cde.state.co.us/sites/default/files/documents/cdereval/download/spreadsheet/2005pm/school/05-06pmbygrade.xls

    seems to show 131 DSST freshmen and 106 sophomores in 2005. One could guess that there had been 131 freshmen in 2004 and 25 of them dropped out, or one could guess that there were 119 freshmen in 2004, 13 of whom were retained, and none dropped out. Or some circumstance in between those possibilities. In any event, I think you may be underestimating the effects of grade-level retention in the figures you’re examining.

    From that chalkbeat article: “Sometimes, DSST’s Kurtz said, getting a student the help they need to get to college involves asking them to take an extra year of high school. The school held back 19 students, or just under 30 percent, of its original 2009 cohort.”


    • That’s fascinating, because Success Academy often “retains” huge cohorts of 2nd graders, too. And first graders. And Kindergarten students. Most come from low-income families, and some are even held back more than one year. Some get held back before they even spend a day at school! Those new lottery-winning incoming 1st graders take a test, and if their “Kindergarten education” at another school is considered lacking by the higher standards Success Academy demands of students starting in first grade, they are given a choice: either accept a spot in a lower grade (repeat a year instead of joining their rightful class) or not enroll at all. If by “fortuitous” chance that “not up to snuff” 6-year-old decides not to repeat Kindergarten again, that’s an extra spot for the next child, who presumably can do a better job proving his worth as a student.

      Of course, if a public school failed a quarter (or more?) of the at-risk kids, the anti-public school folks would be screaming about how the teachers failed to educate all those students. If a charter school fails those kids, it is only because they have “high standards” and the child just hasn’t worked hard enough to meet them and needs to be punished and humiliated until he starts working harder.

      Stephen Ronan, you seem to imply that probably lots of kids were held back. But where are the statistics showing that the “retained” students ever graduated from that school? Or did they finally drop out after turning 18 and sitting in a room with 14-year-olds? Shades of the held-back 8- and 9-year-olds sitting with 6- and 7-year-olds because they “aren’t working hard enough” to be allowed to take the 3rd grade standardized tests. What is more likely: that next year, when they are told they will still be in 2nd grade, they leave? Or that they are allowed to finally be 10-year-old 3rd graders and eventually graduate from 5th grade at age 12 or 13? Those kinds of “retention” policies do go a long way toward explaining how a charter school that attracts only the most motivated parents would have so many violent children that over 20% needed to be suspended. But do they go a long way in having kids achieve academically? Or just a long way in encouraging them to look for another school?

      The fact that you don’t seem to have the answer indicates that is a question the “reformers” prefer no one ever asks.


      • Parent010203:
        I’m on the East Coast and have never previously paid attention to Colorado statistics… But since you ask, here: regarding Dsst: Stapleton High School, I find a spreadsheet for “School Level Data: 5-Year Graduation Rates” which on line 180 shows in column I (All students graduation rates): 96.5%
        https://www.cde.state.co.us/cdereval/20132014_cohort5_graduates_and_completers_by_school_gender_and_raceethnicity
        96.5%? Not too shabby? If you find different, more accurate, info, please let me know.


      • Stephen –

        Graduation rates are calculated based on four years. All studies that cite graduation rates are talking about four years in high school. Five years and more are usually called “completers,” meaning students have completed high school in five or more years. The 96.5% rate you cite is not too shabby, but it is not a four-year graduation rate, and it is 96.5% of how many students? What is the cohort they are using to derive this number? The dropout rate for this charter network is unacceptable. And “education reformers” only talk about improving graduation rates. It is one of the main tenets of “reform,” surpassed only by eliminating the achievement gap (which is increasing dramatically in Denver). I ask that honest conversations occur where statistics are not spun and apples are compared to apples. Many charter schools get rid of students who don’t test well. They are not a solution to educating all children equitably.


      • Hi Jeannie
        I should have referred to column H rather than column I as the source of that 96.5% “All Students Graduation Rate” figure.
        A quick search at scholar.google.com for: high school “5 year graduation rate” turns up a lot of studies that seem to use that measure.
        I had put this hyperlink in angle brackets and it seemed to disappear from my posting:
        https://www.cde.state.co.us/cdereval/gradrates1314
        That web address is where I found that spreadsheet that I cited… You can probably find further answers to your questions there in respect to their methodology and more specifics for DSST.
        I see, for example, that it states: “The completion counts and rates include all students who graduate on-time with a regular diploma plus students who complete on-time with a GED or non-diploma certificate. It is important to note that graduates are included in the completer count and rate, therefore the completion counts and rates for any school or district will be greater than or equal to the graduation rate.”
        From that and from the Column H heading I get the impression that the 96.5% statistic does refer to graduate recipients of H.S. diplomas rather than a GED or non-diploma certificate. Sorry for any confusion caused by my originally misidentifying the column.
        You write: “I ask that honest conversations occur where statistics are not spun and apples are compared to apples.”
        Sounds good to me!
        “Many Charter schools get rid of students who don’t test well.”
        It seems very plausible to me that that sometimes happens, but I haven’t yet seen evidence of how common it is. In any event, so far as I can tell, that description doesn’t seem to fit DSST Stapleton.
        – Stephen


      • Stephen, maybe I am missing something, but I am not seeing anything in the link you cited that shows the number of students who enrolled as freshmen in that school. The calculation as to whether a charter school is doing well should be a simple matter, and if I were an oversight agency I would want to know exactly what happened to every single child who won the lottery in 9th grade (or Kindergarten). Did 96.5% of those students graduate in 5 years? Or did 96.5% of the students who stuck around for 4 or 5 years graduate, which may be a much smaller number than the original cohort? I can’t even tell from these spreadsheets what the total number of freshmen who enrolled at this charter is, nor do I know how many stayed at the school and graduated within 5 years. But I agree that if you are guaranteeing that this chart proves that if 100 freshmen enrolled, then 96 of those students (not replacement ones) graduated in 5 years, that is a very good graduation rate for a school that serves a very high-needs population. Are the majority of the students low-income?

        There are charters like BASIS which brag of 100% graduation rates and high test scores, but the schools lose so many students along the way that saying “100%” is misleading. Is it somehow better for a charter school to have a graduation rate of 100% of the students remaining (even if that is only 50% of the original cohort)? Or is it better to have a 50% graduation rate when a school not only has to educate the fast learners, but is also making its very best effort, and expending money and resources, to teach the slow learners drummed out of charter schools? Or is a public school’s lack of success with most of those slow learners that charter schools don’t want to teach a sign that its teachers are just terrible? That’s what the pro-charter folks say. And in New York State at least, charter “oversight” agencies like SUNY (I use the term “oversight” jokingly, of course) are so much more impressed with whether most of the students in the charter pass the state exams, and don’t care at all how many kids are drummed out. They have Beth Fertig’s “data”, so that’s all the proof they need that most kids are staying. Why should an oversight agency actually bother to look at what happens to the original students (in your example, the 9th grade enrollees) if looking too closely might show that the charters with the best results have unusually high attrition rates for a SUCCESSFUL public school?

        But I can’t tell from this link whether this data demonstrates a charter school good at educating students — including those who are hard to teach — or a charter school good at drumming out the lowest performing students. Maybe you can.


      • I think you pose excellent questions, parent010203. It appears that in calculating graduation rates, the state’s denominator excludes transfers out to other schools while including transfers in from other schools. In respect to 4-year graduation rates, they state:

        “What is the graduation rate? The graduation rate is a cumulative or longitudinal rate which calculates the number of students who actually graduate as a percent of those who were in membership and could have graduated over a four-year period (i.e., from Grades 9-12).

        “A graduation rate will be reported for each particular graduating class (i.e., the Class of 1994). The rate is calculated by dividing the number of graduates by the membership base. The membership base is derived from the end-of-year count of eighth graders four years earlier (i.e., in the spring of 1990), and adjusted for the number of students who have transferred into or out of the district during the years covering grades 9 through 12.”
        https://www.cde.state.co.us/cdereval/rvdefine
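
        Put as a formula, the definition just quoted works like this (a minimal sketch; the variable names are mine, and the sample numbers are invented for illustration):

        ```python
        def cde_grad_rate(graduates, eighth_grade_count, transfers_in, transfers_out):
            """CDE-style rate: graduates over a membership base derived from the
            end-of-year 8th-grade count, adjusted for transfers in and out."""
            membership_base = eighth_grade_count + transfers_in - transfers_out
            return graduates / membership_base

        # Invented numbers: moving 30 students out shrinks the denominator,
        # so the reported rate rises even with the same 100 graduates.
        print(cde_grad_rate(100, 130, 0, 0))   # ~0.77
        print(cde_grad_rate(100, 130, 0, 30))  # 1.00
        ```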

        So it is reasonable to hypothesize that DSST Stapleton transferred out the students least likely to graduate. But the power of that as a potential explanation for their having an uncommonly high graduation rate seems undercut by the fact that they have an extraordinarily high stability rate of 95.7%. “The stability rate represents the number and percent of students who remained at a school/district without interruption throughout the school year.”
        https://www.cde.state.co.us/cdereval/mobility-stabilitycurrent
        https://www.cde.state.co.us/cdereval/school_mobility_rates_by_genderethnicity_20142015

        One could then hypothesize that they are carefully drumming out lots of students in between school years. However the DSST Stapleton “re-enrollment rate, the percentage of students who return to a particular school from one year to the next” as of 2015 was reported to be 94.2%, far above the district average. See:
        https://web.archive.org/web/20150323042309/http://www.denverpost.com/carroll/ci_27474391/do-dsst-schools-have-an-unfair-advantage

        I guess one might also hypothesize that the relatively few students transferring out are better prepared to graduate than their peers who have never had any opportunity to attend the school.

        I would certainly agree with your implication that the presentation of statistics by many state education departments could likely be substantially improved in their capacity to enlighten us on the kinds of crucial questions you pose.


      • My figures regarding DSST graduation rates come from:

        1) the number of freshmen taking Colorado’s standardized tests from 2005 (DSST Stapleton’s first year of existence) through 2012 (the year the class of 2015 would have been freshmen). I had to go through each year of tests to get this number.

        2) the number of graduates from the same school came from the school itself, DPS graduation numbers, or CDE graduation numbers.

        The numbers are as stated above: 972 entered as freshmen; 549 graduated; 56.5%. The Green Valley Ranch DSST has had one graduating class: 145 freshmen, 86 graduated, a 59.3% graduation rate.

        What happened to those who didn’t make it? Where did they go and why did they leave? DSST doesn’t keep that information and told me to call the district which I did. The District told me their data system only tracked intra-district transfers. I am still awaiting that information.

        It strikes me as very odd that a data-driven system isn’t keeping data on two very important data points, especially when there are more DSSTs on the way. And the numbers Stephen cites make no sense, even with repeaters. DSST’s very high attrition rates are not acceptable. This is not how public schools work.


      • “the number of freshmen taking Colorado’s standardized tests from 2005 (DSST Stapleton’s first year of existence) through 2012”

        How do you avoid double-counting those who repeat their freshman year?

        Earlier you wrote: “Graduation rates are calculated based on four years. All studies that cite graduation rates are talking about four years in high school.”

        Does that mean that you only factored in 4-year graduation rates?

        If it happens that you double-count those who repeat freshman year in constructing the denominator in your calculation, and then don’t include them at all in the numerator if they graduate in five years, that would help explain why the graduation rates by your method are lower than the official state statistics.
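
        A worked example of that distortion (all numbers invented purely for illustration):

        ```python
        # Suppose each entering class has 100 genuinely new freshmen, 10 of whom
        # repeat freshman year (and so appear twice in year-by-year freshman test
        # counts), and 80 eventually graduate, 8 of them in five years.
        years = 8                        # e.g., the 2005-2012 entering classes
        new_freshmen = 100 * years
        double_counted = 10 * years      # repeaters counted again the next year
        grads_4yr, grads_5yr = 72 * years, 8 * years

        true_rate = (grads_4yr + grads_5yr) / new_freshmen        # 0.80
        tally_rate = grads_4yr / (new_freshmen + double_counted)  # ~0.65
        print(true_rate, tally_rate)
        ```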

        It’s difficult to imagine a clear, simple way to calculate graduation rates that wouldn’t result in some distortions/misperceptions in certain circumstances. I can’t off-hand think of a better method than what the state education department apparently uses, but would agree with you and parent010203 that that could be helpfully supplemented by further information about what happens to those who transfer out of schools.

        Best wishes,
        – Stephen


      • “I can’t off-hand think of a better method than what the state education department apparently uses…”

        Say what?

        How about a method where the denominator is the total number of 9th graders who started the first day at that high school, and the numerator is the total number of those SAME students who graduated from that school within 5 years. That is a real graduation rate. If you use any other numerator, your graduation rate is, frankly, meaningless. For example, if a school loses 40% of its low-performing freshmen and recruits even more higher-performing students to come in junior year and replace them, it can have a “graduation rate” of 120%. And what appears to be a rousing “success” would hide the fact that the school failed with 40% of its students.

        In my suggestion, which is how scientists would measure the value of a drug or treatment, once you know what % of the original cohort made it to graduation, you can look more closely at why the numbers are what they are. If the school has a 97% graduation rate using the system I suggest, obviously it is doing something right. But a 65% graduation rate may actually reflect a school doing better than one with an 80% graduation rate. If the 20% of students who leave one school were leaving for another public school, who are those kids, and were they the top performers or the lowest performers at the charter? If almost all of the 35% of non-graduating students at the other school had transient families who moved to a different city, that reflects something else.
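
        Concretely, the rate being proposed here could be computed like this (a minimal sketch; the code and the tiny set of student records are invented for illustration):

        ```python
        def cohort_grad_rate(day_one_students, grad_years, max_years=5):
            """Denominator: everyone enrolled on Day One of 9th grade.
            Numerator: those SAME students who graduated from this school
            within max_years. Later arrivals are ignored entirely."""
            graduated = {s for s, y in grad_years.items() if y <= max_years}
            return len(day_one_students & graduated) / len(day_one_students)

        day_one = {"s1", "s2", "s3", "s4", "s5"}
        grad_years = {"s1": 4, "s2": 5, "s9": 4}      # s9 transferred in later: not counted
        print(cohort_grad_rate(day_one, grad_years))  # 2/5 = 0.4
        ```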

        But it is impossible to do this kind of comparison if you START with a graduation rate that hides the number of those starting 9th graders who graduate from the school. Then the intent is more about PR than real research. That is why scientists would look at a treatment effect based on the original cohort and would never simply say “hey, it works on 99.5% of the people” when half the starting cohort mysteriously disappears. A scientist would say “hey, it works on 50% of the people; let’s see why it didn’t work on the others. Did they drop out due to bad side effects? Did they drop out because they were moving away and could no longer be part of the study?” And the bigger that “drop out” number, the closer the scientists would look to make sure there wasn’t an attempt to get rid of the patients who didn’t do well with the treatment. With education, the faux researchers are desperate to STOP that kind of analysis of charter schools. That’s why they are so fond of studies like the Beth Fertig one that Mr. Merrow cites here, which has already been lauded by the pro-charter movement as offering definitive “proof” that charters don’t lose students! The pro-charter folks aren’t really interested in knowing how their schools are doing in educating at-risk kids. If they were, they would understand how meaningless that kind of “data” really is. If a study like this were published in a scientific journal with the conclusions it was making, it would be laughed out of science.


        parent010203 –

        Thank you for explaining how the data should be analyzed. If you look at the number of freshmen starting at DSST and then follow that class to sophomore year, you will see a significant drop in students. And if you then look at the freshman class for the following year (when year-one freshmen are now sophomores), you will not see a higher number of freshmen; so even if some are retained – and I do not think more than 10% are ever retained per year – the class size does not increase significantly. And 10th grade is the last year DSST admits new students. From then on it just loses students.

        Losing 40% of any class of students is not acceptable, especially when neither the charter organization nor the public school district responsible for educating children has any real idea why these losses are occurring or where these students are landing. At least that is the spin they are promulgating. My guess is neither entity wants the public to know the games being played. Accountability? Only for those who have no real control over the measures being evaluated. No accountability for the real decision makers and enforcers.


      • I feel like we’re mostly in agreement… each method we can think of has significant potential flaws in different scenarios. And it’d be best therefore for education departments to facilitate a multiplicity of approaches to graduation data analysis.

        The method Colorado uses lends itself particularly well to aggregations of city-wide, statewide, or national data, where transfers in and transfers out tend to balance out. People, if forced to choose, would generally prefer an answer to how likely a child is to graduate high school than to how likely a child is to graduate from the same school that he or she attended at the start of 9th grade.

        So, I’d see your preferred method as a helpful supplement rather than substitute for the method Colorado uses.

        “But it is impossible to do this kind of comparison if you START with a graduation rate that hides the number of those starting 9th graders who graduate from the school.”

        I’m not persuaded. Starting with either type of graduation statistic, one could continue on to look at stability/attrition/re-enrollment rates, examine how many students leave a school early and examine why.

        A key question that concerns us is the degree to which charter schools transfer out students least likely to successfully progress academically. Perhaps an additional route to getting some sense of that is to look at the results in traditional district schools when charter schools arrive. As one measure, if it is the case that charter schools generally accept populations most likely to graduate and then push out those least likely to graduate, we’d expect to see corresponding, deteriorating graduation rates in traditional district schools, wouldn’t we?

        On the other hand, a charter school association in Massachusetts provided this tidbit of evidence:

        “In 2009, one Boston charter school reported the following about students who returned to the district:
        “90% of the students who returned earned a high school diploma; the district average is 60%;
        “95% of the transfers were proficient on MCAS. Among this same group of students, only 10% of them had been proficient when they first enrolled at the school. So, the students who were supposedly dumped back to district schools were stronger academically than average BPS students and had progressed academically while attending the charter school.”

        If it is the case that attendance at a charter school boosts chances of graduation, transfers back to district schools might have a relatively benign effect on traditional district school graduation rates.

        If you come across relevant data, I’d be interested to know.

        BTW, tangentially, what do you think of the methodology used in the Stanford CREDO studies?


      • Stephen Ronan, you wrote “In 2009, one Boston charter school reported the following about students who returned to the district: “90% of the students who returned earned a high school diploma; the district average is 60%; “95% of the transfers were proficient on MCAS. Among this same group of students, only 10% of them had been proficient when they first enrolled at the school. So, the students who were supposedly dumped back to district schools were stronger academically than average BPS students and had progressed academically while attending the charter school. If it is the case that attendance at a charter school boosts chances of graduation, transfers back to district schools might have a relatively benign effect on traditional district school graduation rates.”

        The red flag: “Among this same group of students [who leave charter schools for district schools], only 10% of them had been proficient when they first enrolled at the school.”

        But something important is missing: this information tells us that a very high % of the students who left the charter school for a public school started out below proficient. Are the ones whom the charter school retained disproportionately the ones who started proficient and above? In other words, if only 30% or 50% of the original enrollees were not proficient, and yet nearly 90% (or more?) of the ones who left were not proficient, my next question would be “what percentage of the students who came in below proficient were retained by the charter and graduated?” Aren’t you asking yourself that question? Because there is a big difference between retaining 50% of the kids who come in below proficient and retaining 90% of them. And the fact that those students did okay in the public high school where they ended up does not mean that it is no longer important to ask. On the contrary, it is very important.

        It’s odd that your “conclusion” from that study would be that spending any time whatsoever in a charter school means the charter school should get credit for the better performance. Most people would look at that and think that charter schools are very fortunate to have only students with parents who are committed to their education, so that even if those students leave a charter school, they are likely to do better academically than a cohort that also includes transient and completely disinterested students from families that are dysfunctional and otherwise completely unengaged in their education.


      • I think your question, parent010203, of whether those transferring out of charter schools tend to be those whose testing had demonstrated the least proficiency when they entered is a good one. Still it may be worth noting that, for example, the kinds of traditional district middle schools from which Boston charter schools normally draw would more likely show percentage proficiency rates in the teens or twenties rather than the fifties or sixties of your hypothetical example.

        I don’t entirely follow the logic of your last paragraph. You imply that, when a group of children demonstrates a 10% proficiency rate prior to attending a charter school and 95% afterwards, the key factor is their having parents committed to the children’s education. But presumably the set of parents remains largely unchanged before, during, and after the charter school experience.

        At the same time, obviously you’re right to suggest that attempts to measure impacts of charter schools should be careful to try to control for the level of investment, engagement, support that parents provide. In that respect, I would wonder again what you think of the methodology of the Stanford CREDO studies, e.g., this one:

        Click to access Urban%20Charter%20School%20Study%20Report%20on%2041%20Regions.pdf


      • Stephen Ronan, Mr. Merrow’s post was about whether this study done by WNYC did anything beyond “comparing the kids who go to the playground to toss a ball around with the kids whose parents enroll them in the karate program at the Y, buy them uniforms and accompany them to practice and competitions.” In other words, a pretty worthless study if your purpose is to actually see if attrition is weeding out poor performing students or not. Do you agree? Because if you believe that this so-called “study” provides evidence that high performing charters don’t weed out low performing students, then anything I post isn’t going to change a view that you already hold dear.

        “….the kinds of traditional district middle schools from which Boston charter schools normally draw would more likely show percentage proficiency rates in the teens or twenties rather than the fifties or sixties of your hypothetical example…..”

        Having a middle school with a proficiency rate in the “teens or twenties” means that the top fifth of the middle school students are ideal candidates for the charter. Why wouldn’t you assume that the top-performing students made up the majority of students who started in that charter? Some charter schools (perhaps not in Boston) discourage low-performing kids who win lotteries by giving them a test before the first day and telling them that they have to repeat the grade they just graduated from — a sure-fire method to make sure some of those kids turn down the spot. So instead of just assuming, let’s look at the data. What IS the percentage of students who enrolled in that Boston charter who were not proficient? If the study knows the proficiency level of the students who left — 90% of them were below standards — then it obviously knows the proficiency level of the entire cohort and of the ones who remained. What was it? It’s a simple question, and answers like “the middle schools from which they drew more likely show low proficiency rates” just seem deceptive when this data is obviously known. What are the actual numbers? Did a disproportionate number of the non-proficient students leave?

        To address your question about the CREDO study: it seems like an utterly pointless exercise.

        Bottom line is that some charter schools do better and some do worse than public schools. In some cities there are more charters that do better than do worse, and in some cities there are far more charters that do worse. But this study isn’t interested in why, which is very odd. What are the characteristics of the charters that do better? Isn’t that the point? Are the ones doing better doing so because they weed out the kids who are the most difficult, so they can spend all their time and resources focusing only on the students they can most easily educate? You would see much higher “average” attrition rates in the highest-performing charters if that were the case. (By the way, that turns out to be the case, looking at BASIS and Success Academy.) Is the lesson for public schools to have more magnet schools that accept students by lottery but are free to weed out the low performers? That would be a legitimate conclusion from the CREDO study: low-income “strivers” will improve more at public magnet schools (they don’t have to be charters) that are limited to students with motivated parents, where those who can’t do the required work must leave.

        A useful CREDO study would have looked at the charter schools that didn’t match the highest-performing ones and asked why. Was it because, like the KIPP school in NYC, they had very low attrition rates and were keeping all the students who enrolled? Did they have far more students with serious disabilities, and were their ELL students the kind who have just arrived from another country with parents who speak no English? Did the “successful” charter schools’ ELL students have parents who are highly educated and fluent in English themselves but have raised their child to speak a different language at home? There is a difference.

        For a very long time the charter school industry has been unwilling to look at why some charters do better and some do worse. They still are unwilling to do it. And all the CREDO studies in the world keep proving the obvious: that if you take a group of students with motivated parents who remain in a charter school (remain being the operative word), they will do better than a group of students that includes those without motivated parents who are in a public school that has to use their resources to educate all of them. It’s an argument for lottery based magnet schools with no obligation to educate every child who wins the lottery.

        And that’s why none of these studies compare high-performing charter school networks to lower-performing charter school networks. I suspect it is because they suspect the not-surprising results of the CREDO study will very likely hold: the students who remain in charters more ruthless about ridding themselves of underperforming students will do better than students in charters that are committed to educating every child. Maybe I’m wrong. But the unwillingness of charter schools to actually look at the characteristics of better-performing charter schools compared to lower-performing ones is a huge red flag. If anything, it should be the biggest topic of research, instead of all the wasted time and money showing that some charters — in some cities a majority — do better than public schools and some charters — in fewer cities a majority — do worse. What exactly have we learned? Absolutely nothing.


      • Parent010203: “In other words, a pretty worthless [WNYC] study if your purpose is to actually see if attrition is weeding out poor performing students or not. Do you agree?”

        Pretty worthless? Opponents of charter schools have commonly, mostly mistakenly, substantially exaggerated attrition rates of charter schools. If that has been the case in New York, as it has certainly been elsewhere, then just providing reasonably accurate figures comparing charter schools with traditional district schools would have value. I would agree with you that the study certainly doesn’t rule out the very plausible possibility that some degree of charter school success in the arena of student test-taking is attributable to pushing out students least likely to succeed. The question shouldn’t be does it ever happen. Surely it does. The question is better put as how much does it happen. Mistaken perceptions of inordinately high attrition rates at charter schools allow far more room for that as an explanatory factor than if it is demonstrated that charter schools in fact have significantly lower attrition rates than local, traditional district schools.

        My sense is that, in Boston, “pushout” is a very minor contributor to extraordinary charter school success. According to one of the Stanford CREDO studies, the results for the typical student in a Boston charter equated “to more than twelve months of additional learning per year in reading and thirteen months greater progress in math.” When I first heard it, I didn’t believe it. 12-13 months _additional_ per year? Two+ for the price of one? I thought the study had been cited incorrectly. Couldn’t be possible. But that’s what it says.

        And, as I think you are aware, the CREDO study is certainly not a case of “comparing the kids who go to the playground to toss a ball around with the kids whose parents enroll them in the karate program at the Y, buy them uniforms and accompany them to practice and competitions.” The CREDO study takes great pains to compare the progress of children who enroll in a charter school with carefully matched children who had also participated in the lottery but had not had the luck of the draw.
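
        To make the matching idea concrete, here is a minimal sketch in Python (my own simplification for illustration, not CREDO’s actual code, and every name and number below is invented): for each charter entrant, find a district student in the same demographic group with the nearest baseline score, then compare growth rather than levels.

            # Minimal sketch of matched comparison (my simplification, NOT CREDO's code).
            # Each record: (baseline_score, followup_score, demographic_group).
            charter_entrants = [(610, 660, "A"), (580, 640, "B"), (650, 700, "A")]
            district_pool = [(612, 630, "A"), (578, 600, "B"), (655, 665, "A"), (500, 520, "B")]

            def closest_match(student, pool):
                # Same demographic group, nearest baseline score: a crude
                # stand-in for CREDO's "Virtual Control Record".
                same_group = [p for p in pool if p[2] == student[2]]
                return min(same_group, key=lambda p: abs(p[0] - student[0]))

            # Compare growth, not levels: charter gain minus matched-control gain.
            effects = []
            for s in charter_entrants:
                m = closest_match(s, district_pool)
                effects.append((s[1] - s[0]) - (m[1] - m[0]))

            print(f"estimated charter effect on growth: {sum(effects) / len(effects):+.1f} points")

        The point of the design is that both groups signed up for the lottery, so parental motivation is held roughly constant; only the luck of the draw differs.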

        I can explain some, but not much, of Boston charter school success by attrition of less-motivated students. I can explain some, but not most, of it by the fact that students in the traditional district schools have some in their cohort who disrupt their learning. (Boston district schools, by the way, on occasion resort to moving disruptive students out to an excellent, separate facility offering a “focus on emotional, behavioral, and learning needs,” a “highly structured behavior management system,” and “intensive clinical supports.”)

        You write: “To address your question about the CREDO study: it seems like an utterly pointless exercise… Bottom line is that some charter schools do better and some do worse than public schools. In some cities there are more charters that do better than do worse, and in some cities there are far more charters that do worse. But this study isn’t interested in why, which is very odd. What are the characteristics of the charters that do better? Isn’t that the point?”

        In order to determine the characteristics of systems where charter schools are most successful, it is a rather helpful exercise to first identify which schools in which locales are most and least successful. At least in respect to one measure, tests of academic progress, who does that better than Stanford/CREDO, with its matched controls?

        These are among the conclusions of the Center for Public Education after the 2009 CREDO study:

        “States that allow multiple authorizers—from municipal agencies to colleges and non-profits—had the weakest student achievement data for charter students when compared to students at traditional public schools. A rigorous study by the Center for Research on Education Outcomes (CREDO) found ‘a significant negative impact on student academic growth’ for charters in states that allow multiple agencies to authorize these schools. In effect, CREDO said, the presence of multiple authorizers allows charter organizers to ‘shop’ for the most advantageous route to approval. Similarly, the RAND Corporation (Zimmer et al 2009), in analyzing eight states, found that Ohio had ‘an especially wide range of variation’ in achievement, which the authors attributed to the state’s ‘unusually diverse group of organizations to serve as charter authorizers.'”

        You write: “But the unwillingness of charter schools to actually look at what the characteristics of better performing charter schools are when compared to lower performing ones is a huge red flag.”

        May I commend to your attention this book, particularly the chapter on Match Charter School: “Inside Urban Charter Schools: Promising Practices and Strategies in Five High-Performing Schools” (2009) by Katherine K. Merseth and Kristy Cooper?

      • I didn’t see the answer to my question:

        What is the percentage of all students who enrolled in that Boston charter cohort who were not proficient? If the study knows the proficiency level of the students who left — 90% of them were below standards — then it obviously knows the proficiency level of the entire cohort and the ones who remained. What was it?

        Why do charter school “studies” never seem to want to look at the data that would give the true picture, even when it is available? Either that charter school lost a disproportionate share of the below-proficient students in its starting cohort or it did not. But without the data, we could be talking about anything up to 100% of the below-proficient students leaving. I doubt that is the case, but purposely leaving out that data certainly makes me wonder what is being hidden in those numbers.

        The one thing the WNYC story revealed but barely mentioned is that students leave the highest-performing charter network at a much higher rate than they leave a lower-performing one. That’s a fact. Ironically, KIPP used to be one of those high-performing charter networks until it made a concerted effort (at least in NYC) to stop suspending and getting rid of so many students. Thus its low attrition rate and low test scores.

        Why else would a high-performing charter school lose so many students? High-performing PUBLIC schools don’t. The highest-performing public schools have low attrition rates, not extraordinarily high ones.

        Where in the CREDO study does it say that they account for attrition? Charter schools weed out the students who won’t do well in their school, period. Not all do it as aggressively as the highest-scoring ones, but since those aggressive charter schools are always included in the average, the charter average will always be inflated by them.

        That is why the important comparison is the one the charter school folks are terrified to make: comparing attrition rates among charter schools to see if there is a correlation between high attrition rates and very high performance. If there is, then the emperor has no clothes. So it will never be done.

        You think attrition doesn’t account for those high scores in charter schools? If it didn’t, high-performing charter schools would rarely lose a student, just like KIPP now rarely loses a student. And KIPP pays the price with much lower test scores.

      • “What is the percentage of all students who enrolled in that Boston charter cohort who were not proficient?”

        You’ll need to do that research yourself. I had cited the illustrative example as a “tidbit” offered by a Mass. Charter School Association. I don’t know anyone over there, but you could ping ’em with an email. You’ll find the reference in the second of these documents, both of which may be of some interest but not likely to answer all your questions.

        Click to access final_attrition_report_june_2014_0.pdf

        Click to access fact_sheet_-_charter_school_attrition_2015.pdf

        [..]
        “Where in the CREDO study does it say that they account for attrition?”

        Well there’s this from the technical appendix to the 2013 national study (TPS=traditional public school, VCR=Virtual Control Record):

        “…CREDO found suggestive evidence that charter-bound TPS students were on an accelerating negative trend in the two years before switching to a charter school, and these students had different starting achievement than other students in our analysis. Second, as noted above, CREDO limited the ‘head to head’ comparison of fixed effects and VCR methods to only students that switched from TPS to charter schools, and excluded students that move from charters to TPS. This was done because the VCR method by its construction only captures students who either switch from TPS to charter or “grow up” charter; if a charter student in our analysis switches back to TPS they are no longer followed.
        “To see if limiting our ‘head to head’ comparison to only students that switch from TPS [to] charter could be affecting our estimation, CREDO reran our comparison of fixed effects and VCR methods, this time including students that switch between the charter and TPS sectors in either direction (as would be the case in a traditional fixed effects estimation). The results for this model were indeed closer to the overall findings from the paper. This is likely due in part to a slight downward trend among TPS achievement in our data, (also noted among VCRs in the body of the report). This implies that the exclusion of TPS records from later years, or rather the over sampling of early observations in TPS, biases down the estimated charter effect by biasing up the TPS counterfactual.”
        https://credo.stanford.edu/documents/NCSS2013_Technical%20Appendix.pdf (pg. 7)

        Can’t say I understand that thoroughly, but I get the impression that they believe attrition would not explain away their results.
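
        For what it’s worth, a toy simulation (my own construction in Python, not CREDO’s model, and all numbers are invented) helps me see the last point in that quote: if district achievement drifts downward over time and early observations are oversampled, the estimated district counterfactual comes out too high, which would bias the measured charter effect downward.

            import random

            random.seed(0)

            # Toy simulation (NOT CREDO's model): district (TPS) achievement drifts
            # slightly downward over six years. We only compare two ways of
            # estimating the TPS counterfactual.
            tps_by_year = {y: [random.gauss(-0.02 * y, 0.1) for _ in range(1000)] for y in range(6)}

            # Counterfactual 1: all years weighted equally.
            all_scores = [s for scores in tps_by_year.values() for s in scores]
            balanced = sum(all_scores) / len(all_scores)

            # Counterfactual 2: later years undersampled, mimicking the exclusion
            # of later TPS records described in the appendix.
            early_heavy = [s for y, scores in tps_by_year.items() for s in scores[: 1000 - 150 * y]]
            skewed = sum(early_heavy) / len(early_heavy)

            print(f"TPS counterfactual, all years weighted equally: {balanced:+.4f}")
            print(f"TPS counterfactual, early years oversampled:    {skewed:+.4f}")
            # The skewed counterfactual sits higher, so a charter effect measured
            # against it would be biased downward, consistent with the quote above.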

        “You think attrition doesn’t account for those high scores in charter schools? If it didn’t, high performing charter schools would rarely lose a student.”

        If I understand correctly, you anticipate that, in general, charter schools with the highest scores likely have the highest attrition rates. Seems like a reasonable hypothesis, and you could examine publicly available data to get a better sense of whether it is valid. You could build on what you know about NYC, starting with this list of Boston area charter schools with their most recent summer attrition rates.

        Academy Of the Pacific Rim Charter Public School (5-12) 8%
        Boston Preparatory Charter Public School (6-12) 8%
        City on a Hill (Circuit Street) (9-12) 8.6%
        Codman Academy Charter Public School (PK-12) 7.7%
        MATCH Charter Public School (PK-12) 5.5%
        Boston Collegiate Charter School (5-12) 6.9%

        Helen Y. Davis Leadership Academy Charter Public School (6-8) 11.3%
        Brooke Charter School Roslindale (K-8) 5.3%
        Brooke Charter School East Boston (K-7) 3.3%
        Brooke Charter School Mattapan (K-8) 1.6%
        City on a Hill (Dudley Square) (9-10) 12.6%
        Conservatory Lab Charter School (PK-8) 12%
        Roxbury Preparatory Charter School (5-8) 11%
        Neighborhood House Charter School (PK-8) 9.1%
        Dorchester Collegiate Academy Charter (4-8) 18.5%
        KIPP Academy Boston Charter School (K-7) 5.2%
        Boston Renaissance Charter Public School (PK-6) 10%

        My impression is that factoring in school year stability rates wouldn’t much affect the rank order (the great majority of transfers are over the summer), but if you wanted you could find those rates using this downloadable Charter Analysis and Review Tool: http://www.doe.mass.edu/charter/finance/chart/
        If you select a school and then, for example, “Attrition,” be sure to “Select Subgroup” such as “All students” to see the data, and note the tabs, including “Home,” at the bottom of the page.

        Or just put the name of a school in a search engine along with the word accountability and look for the first profiles.doe.mass.edu site that pops up. Then via the “Students” tab you can get to attrition, dropout and mobility rates (not quite as recent as via the Review Tool) and, via the “Accountability” tab, go to the “Complete Report Card,” where you’ll find testing data. (KIPP Academy and City on a Hill-Dudley are relatively new, which limits comparability of their data.)
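
        If you wanted to make that check systematic, a short script could do it. Here is a sketch in Python: the attrition figures come from the list above, but the score column is deliberately left as a placeholder, to be filled in from each school’s report card rather than invented by me.

            # Sketch of the proposed check: rank correlation between summer
            # attrition and test performance for a few of the schools above.
            # The score column is a PLACEHOLDER to fill in from the report cards.
            schools = [
                ("Brooke Mattapan",        1.6, None),
                ("Brooke East Boston",     3.3, None),
                ("KIPP Academy Boston",    5.2, None),
                ("Brooke Roslindale",      5.3, None),
                ("MATCH",                  5.5, None),
                ("Roxbury Preparatory",   11.0, None),
                ("Dorchester Collegiate", 18.5, None),
            ]

            def ranks(values):
                order = sorted(range(len(values)), key=lambda i: values[i])
                out = [0] * len(values)
                for rank, i in enumerate(order):
                    out[i] = rank
                return out

            def spearman(xs, ys):
                # Spearman rank correlation; ties not handled, fine for a sketch.
                n = len(xs)
                d2 = sum((a - b) ** 2 for a, b in zip(ranks(xs), ranks(ys)))
                return 1 - 6 * d2 / (n * (n * n - 1))

            attrition = [row[1] for row in schools]
            scores = [row[2] for row in schools]
            if all(s is not None for s in scores):
                # The hypothesis predicts rho > 0: higher attrition, higher scores.
                print(f"rho = {spearman(attrition, scores):+.2f}")
            else:
                print("Fill in the score column from the state report cards first.")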

        BTW, while you may find some support for your hypothesis in other locales, I’m quite sure you’ll find it countered by the Boston data. Brooke Mattapan, with the lowest attrition rate, is top-notch in respect to test scores. Dorchester Collegiate Academy, with the highest attrition rate, is scheduled to be closed in June. Not too long ago, I attended a meeting there largely consisting of parents talking with the state Commissioner of Education about why he was recommending it be closed. He stressed that the school’s attrition rate was higher and its students’ test scores generally lower than at other charter schools in the city, the test scores barely above those typically found at traditional district schools. In response, the parents had little to say about test scores. One after another they instead talked passionately about the culture of the school, how it provided a safe, encouraging, supportive environment for their children compared to what they had experienced previously elsewhere, and pleaded that it be kept open.

  6. BELOW IS WHAT I WROTE ORIGINALLY, CONFLATING TWO UNRELATED EVENTS. THE MOSKOWITZ LECTURE IS MARCH 29TH. ON APRIL 6TH THE MANHATTAN INSTITUTE IS SPONSORING A DISCUSSION OF CHARTER SCHOOLS AT THE HARVARD CLUB, AN EVENT THAT IS OPEN TO THE PUBLIC. I REGRET THE ERROR. “The Harvard Club of New York is, perhaps inadvertently, also helping Moskowitz. It has scheduled an evening presentation on Monday, March 29th, to be followed by a panel discussion. The blurb describing the event makes no mention of any criticism. Here’s a sample:

    Eva Moskowitz founded Success Academy Charter Schools in 2006 with the dual mission of building world-class schools for New York City children and serving as a catalyst and a national model for education reform to help change public policies that prevent so many children from having access to opportunity. Firmly believing that inner-city students deserve the same high-quality education as their more affluent peers, and convinced that all children, regardless of zip code or socioeconomic background, can achieve at the highest levels, she opened the first Success Academy in Harlem and today operates 34 schools in some of the city’s most disadvantaged neighborhoods. Success Academy continues to grow at a rapid pace and will be hiring more than 900 teachers and other personnel before the next academic year.

    After Moskowitz’s presentation, a discussion will be moderated by a ‘Senior Reporter’ from The 74, which is not a journalistic organization but an advocacy group. The panelists are James Merriman, President, New York City Charter School Center; Michael Petrilli, President, Thomas B. Fordham Institute; and Charles Sahm, Director, Education Policy, Manhattan Institute, all strong charter school advocates who have publicly supported Moskowitz and Success Academies.

    What do you suppose they will ‘debate’? How about this for a tough question: The New York Times: Threat or Menace?

    The event is open to Club members and their guests. (I cannot attend because I will be out of the country.)”

  7. John,

    Thanks for writing this story. It’s an issue that has to be kept in the spotlight.

    Your statement (below) implies that choice leads to more commitment from parents to keeping their kids in the school regardless of unexpected circumstances. Would you then say that parents should CHOOSE which public school they send their kids to? Why should charter parents be the only parents who get to choose which school their children attend?

    On the other hand, getting into a charter school entails jumping through hoops, often a lot of them, and those parents–who have sought out what they hope to be better opportunities for their children–are not going to change schools just because of a job loss or an illness.

    • “Why should charter parents be the only parents that get to choose which school their children attend?”

      Your premise is wrong. The only charter parents with a “choice” are the ones with children whom the charter school is willing to teach.

      In the days before privately operated “public” charter schools, getting rid of a kid you thought was unteachable meant convincing his parents he had special needs, and that in turn obligated your public school to pay very expensive private school tuition to meet those needs. Ultimately, the public school system still paid the cost.

      Nowadays, a charter school network’s financial obligation to a child ends as soon as it can make him miserable enough that his parents pull him from the school. And the “choice” as to whether to embrace him (an above-average student who will learn even with inexperienced teachers) or to target him for misery is entirely up to the charter.

      That isn’t choice.
