TEACHER VAM SCORES RELEASED TO PUBLIC
TAKE ACTION: Ask your legislators to halt the use of VAM data and high-stakes testing until the system is fair, valid, and comprehensible.
February 24, 2014
I know that this is a worrisome day, and that you have heard from Superintendent LeRoy and the FL-DOE this morning regarding the release of your VAM data to the media. Please know that our state Association, the Florida Education Association (FEA), fought to keep this data from being released to the public, but the courts ruled against us. Our main concern remains the fact that this data is deeply flawed. FL-DOE has a poor track record with its data calculations regarding test score levels and school grades. We likewise have no confidence in its use of these complex calculations to assess a teacher’s true performance. As the Florida education commissioner recently stated regarding school grades: “It needs to be simpler so that everyone’s confidence is back in the . . . system.”
What is being released to the media is the value added score, as calculated by the FL-DOE, for the ’10-11, ’11-12, and ’12-13 school years. What is not being released is your Journey data or your EPC information. The data released shows an incomplete and, we believe, inaccurate picture of teacher performance across the district and across the state!
The VAM numbers assigned to teachers are produced by a complex statistical formula. It is inappropriate to attempt to determine the effectiveness of a teacher using a formula that is comprehensible only to a small number of statisticians and that cannot be independently verified.
Nearly all teachers’ VAM numbers are calculated according to students’ FCAT scores, yet only approximately 30 percent of teachers in Florida teach students and subjects tested on the FCAT. So for 70 percent or more of teachers, the VAM does not even attempt to measure the teacher’s actual teaching.
The Legislature openly recognized this flaw in passing SB 1664 in 2013, which requires future VAM scores to be based upon a teacher’s actual students (although not necessarily upon the subject taught by the teacher). SB 1664, while addressing the concern in name, did not address it in reality. It does not ensure that teachers are evaluated on the subjects they actually teach, nor does it address the cost of the unfunded mandate of producing the testing these calculations would require. It merely kicked the can down the road so that, when questioned, legislators could say this was the districts’ fault because the districts determined what to use. The point is that everyone is following legislative and FL-DOE mandates that were and remain erroneous at best and harmful at worst. This problem is compounded by the fact that the historical VAM data does not take the new law into account, making the data even less meaningful.
Most researchers agree that VAM is not appropriate as a primary measure for evaluating individual teachers. Reviews of research on value-added methodologies for estimating teacher “effects” based on student test scores have concluded that these measures are too unstable and too vulnerable to many sources of error to be used for teacher evaluation. A given teacher may appear to have differential effectiveness from class to class, from year to year, and from test to test. Ratings are most unstable at the upper and lower ends of the scale, precisely where they are most likely to be used to determine high or low levels of effectiveness. Even where the VAM is calculated based upon the teacher’s own students and subject taught, it purports to measure 179 days of instruction based upon a few hours of testing. Standardized tests assess only a fraction of what teachers teach and students learn. We wouldn’t dream of reporting or evaluating student performance based upon a single measurement; why should we do so for teachers?
Florida’s VAM formula has additional unique flaws because of the way in which it has been implemented. One-third of every teacher’s VAM score is based upon the overall performance of the school, over which the teacher has no individual control. Many of the calculations include test scores of students the teacher did not teach for the full school year. There are very high error rates associated with many of the scores.
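To see how the school-wide component described above can override a teacher’s own results, consider a simple arithmetic sketch. The one-third school weighting is as stated by FL-DOE; everything else here, including the function name and the numbers, is hypothetical for illustration only, since the actual formula is far more complex:

```python
# Illustrative sketch only: the real FL-DOE VAM is a complex statistical
# model. This shows just the stated one-third school-component weighting,
# with made-up numbers.

def blended_vam(teacher_component: float, school_component: float) -> float:
    """Blend a teacher's own estimated effect with the school-wide effect,
    weighting the school component at one-third."""
    return (2 / 3) * teacher_component + (1 / 3) * school_component

# A teacher whose own students show strong gains (+3.0) can still be
# pulled down to zero by a low school-wide component (-6.0), over which
# the individual teacher has no control:
score = blended_vam(3.0, -6.0)
print(round(score, 2))  # 0.0
```

The point of the sketch is that one-third of the score varies with factors outside any single classroom, so two equally effective teachers at different schools can receive very different numbers.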
That is why we have not used VAM data for teacher evaluations in these prior years. Learning gains show the academic progress a student has made; they are understood and can be substantiated. We chose to use learning gains based on FCAT data this past year in Polk to stay in compliance with Florida law. This year the learning gains option has been removed, but we are working diligently to find a formula that will work for our teachers and comply with Senate Bill 1664.
PEA fully supports teacher accountability. No one wants an ineffective teacher in the classroom. But assessments of teachers, like assessments of students, must be valid, transparent, and multi-faceted. These value-added model calculations are none of these. We hope that anyone who publishes these numbers makes it fully clear to their readers how little meaning these numbers have in determining the quality of an individual teacher. This is why FEA joined the effort to enforce the public records exemption and prevent the “value-added model” or “VAM” data from being published in the first place.
After nearly four years, millions of dollars spent on the formula, and countless hours spent implementing SB 736, these flawed and meaningless calculations are all the DOE has to show for its efforts. Floridians can now see what this effort has accomplished for Florida’s students: nothing. It’s an expensive boondoggle that is taking money from the classroom. Our teachers deserve better, and so do our students.
Contact your legislators at the FEA Action Center and tell them why VAM is a bad idea.