Think Tank Advocacy Reports Not Credible for Education Policy: SC Edition
The Palmetto Promise Institute’s report by Adam Crain, “Money doesn’t translate into student results,” is a follow-up to the organization’s 2013 report, which also compared South Carolina’s education system to Florida’s education reforms.
Although this report offers several charts analyzing SC and FL National Assessment of Educational Progress (NAEP) test data (some disaggregated by race, disability, and poverty, but focused on 4th-grade reading), it proves to be overly simplistic and an incomplete picture of student achievement in both states. The ham-fisted data analysis serves as a thin veneer for advocacy unsupported by valid research or a more nuanced reading of the data.
In short, this report offers significantly inadequate evidence for the ideologically driven recommendations at its end (mostly a mishmash of school choice policy), recommendations this conservative think tank would make regardless of the evidence. There simply is no credible link between the shallow analysis of SC/FL NAEP scores and the policies offered as solutions to the manufactured problems.
Let me outline both the flaws of the data analysis and the folly of the recommendations.
The foundational flaw of both reports is the assumption that comparing SC to FL has any value, resting as it does on the persistent but discredited claim that FL’s education reform has been a success. In fact, the so-called Florida “miracle” has been strongly refuted, notably its grade-retention policy based on high-stakes test scores.
By comparison, SC is slightly more impoverished than FL, and SC has a higher percentage of Black residents (27%) than FL does (16%), both metrics used in the report’s analysis. However, PPI makes no effort to show that its raw comparisons are actually apples-to-apples, or valid.
Another analysis of NAEP data that adjusts for factors impacting test scores reveals a much more nuanced and important picture, one that exposes a huge flaw in the FL model of reform and its dependence on test data.
While adjusted NAEP trend data continue to show FL’s 4th-grade reading scores outpacing SC’s, by 8th grade (see Table 6B1, 2013 data) SC (269.5) and FL (272.3) have nearly identical adjusted scores.
Here is a key point about FL’s retention policy: retaining students can inflate short-term test data, but those gains erode over time. Further, grade retention is strongly correlated with students dropping out of school and inversely correlated with students receiving a diploma (see Jasper, 2016).
Ultimately, the data analysis and charts in this report are overly simplistic on purpose because PPI has an agenda: argue against increased school funding and promote school choice.
The report uses boldface type, lazy math, and insufficient statistical methods to dramatize a baseless claim: “Simple funding comparisons indicate quite the opposite. Over the twelve year period between 1999 and 2011, South Carolina spent a total of $6,920 more per student, or an average of $692 per year.”
Without proper statistical analysis, including controls and a defensible basis for causal claims, this raw-data approach, like the NAEP analysis, means almost nothing.
The body of educational research, in fact, shows that funding does matter (see Baker, 2016).
Both the NAEP analysis and the related argument that SC school funding is somehow excessive or wasteful, then, are statistically inadequate and useless for supporting the recommendations at the end of the report.
Those recommendations fall into two broad categories: accountability and school choice.
SC and FL jumped on the accountability bandwagon early, about three decades ago, and remain completely unsatisfied with their educational outcomes, despite huge amounts of tax dollars and immeasurable time spent on ever-new standards and ever-new high-stakes tests.
Calling for accountability ignores the research base showing that accountability built on standards and testing has failed and will continue to fail:
There is, for example, no evidence that states within the U.S. score higher or lower on the NAEP based on the rigor of their state standards. Similarly, international test data show no pronounced test score advantage on the basis of the presence or absence of national standards. Further, the wave of high-stakes testing associated with No Child Left Behind (NCLB) has resulted in the “dumbing down” and narrowing of the curriculum….
As the absence or presence of rigorous or national standards says nothing about equity, educational quality, or the provision of adequate educational services, there is no reason to expect CCSS or any other standards initiative to be an effective educational reform by itself. (Mathis, 2012)
The evidence on school choice also contradicts the report: choice fails to increase student achievement and is strongly associated with increased segregation and inequity (see here and here).
Let’s summarize the major points of the report:
- The report claims SC lags FL in academic achievement and education reform while spending more per pupil. However, the analysis offered is an incomplete picture and statistically flawed. None of the report’s claims is supported, and more nuanced, longitudinal analyses of NAEP greatly erode the premise of PPI’s report (grounded also in the debunked Florida “miracle” claim).
- The report’s major recommendations about school funding, accountability, and school choice are all strongly contradicted by the research base, which the report fails to acknowledge.
Ultimately, as a colleague responded when I shared this report, PPI has published “a five page Op-Ed with bar graphs,” and I would add, not a very good one at that.
SC should in no way be influenced by this report when making education policy.
However, SC should heed a kernel of the report’s conclusion: “The disparity between the stewardship of resources in Florida and our struggling education system in South Carolina is apparent.”
As I have detailed, while most educational rankings and comparisons prove to be hokum, the evidence from our schools and reform policies shows that SC ranks first in political negligence.
Ironically, this report is calling for more negligence in the pursuit of market ideology.
See the National Council of Teachers of English’s Resolution on Mandatory Grade Retention and High-Stakes Testing:
Grade retention, the practice of holding students back to repeat a grade, does more harm than good:
• retaining students who have not met proficiency levels with the intent of repeating instruction is punitive, socially inappropriate, and educationally ineffective;
• basing retention on high-stakes tests will disproportionately and negatively impact children of color, impoverished children, English Language Learners, and special needs students; and
• retaining students is strongly correlated with behavior problems and increased drop-out rates.
Jasper’s abstract captures the ultimate failure of FL’s reform:
In 2003–2004 approximately 23,000 third graders were retained in Florida under the third grade retention mandate outlined in the A+ Plan. Researchers in previous studies found students who were retained faced difficulty in catching up to their peers, achieving academically, and obtaining a high school diploma (Anderson, Jimerson, & Whipple, 2005; Andrew, 2014; Fine & Davis, 2003; Jimerson, 1999; Moser, West & Hughes, 2012; Nagaoka, 2005; and Ou & Reynolds, 2010). In this study I examined educational outcomes of students retained in a large southwest Florida school district under the A+ Plan in 2003–2004. I used a match control group, consisting of similarly nonretained students, who scored at level one on the Grade 3 Reading FCAT. I then compared the control group to the retained group. I also compared achievement levels on the Grade 10 Reading FCAT of the retained and non-retained group. I evaluated longitudinal data, for both the retained and non-retained students, and found 93% of the retained students continued to score below proficiency (below a level 3) seven years after retention on the Grade 10 Reading FCAT as compared with the 85.8% of the non-retained students. I also compared standard diploma acquisition of the retained and non-retained group. The non-retained group was 14.7% more likely to obtain a standard high school diploma than the retained group. Finally, I used data from previous studies to extrapolate economic outcomes.
Baker’s analysis has key points detailed in the Executive Summary (p. i):