Saturday, January 30, 2010

Comparing Schools to Each Other

We wouldn't do it to our students, so why would New York school board officials do it to their schools? In today's New York Times, Jennifer Medina writes about how the school boards have given high grades to 97 percent of elementary schools. This has led the school board to change the way it evaluates schools so that a certain number of schools will always get an "A" and a certain number will always get an "F". We don't evaluate our students by comparing them to each other, so why do this to schools?

In the article, Medina quotes Shael Polakow-Suransky, the chief accountability officer, as saying in an interview, “We want to be able to really show how much value a school is actually adding.” This just sounds like marketing language. The term "value-added" reminds me of how companies include "valuable trial software" on a USB drive. Like I really want to pay for a product that has trial software I didn't ask for!

There is a real danger in running schools like a business. There is only a finite amount of money to go around, and handing out funding the way bonuses get handed out at Christmas to the top salespeople does not create incentives. It creates more "have" and "have-not" schools.

I don't have a solution for improving performance in schools other than to keep giving teachers at the grassroots level what they need. Also, quit trying to measure the value of an individual school by test scores. The tests change frequently. What value a school adds to a community cannot be measured in its entirety on paper. Changing the hearts and minds of children for the better cannot be measured by a single score.

The big questions we should keep asking ourselves are "Are we giving kids the tools to be successful tomorrow?" and "In what variety of ways can we measure success in preparing our youth for the future?"
