Dan Goldhaber is a scholar at the University of Washington. Via the NCTQ blog, I just read an interesting paper he wrote with his colleague Stephanie Liddle. (Disclosure: he and I are co-writing a chapter with Susanna Loeb about the future of “human capital” for a Harvard projecty thing.)
Pictured is a list of Ed Schools in the state of Washington. Dan is examining a question: which institutions seem to generate better teachers?
Or, more precisely, which Ed Schools generate teachers who — when you control for a million things, like where they go on to teach after graduation, the demographics of their students, the entering SAT scores of the teacher candidates, and so on — get elementary school children to learn more math and English, as measured by value-added gains?
The authors write:
There is considerable variation in the effectiveness of teachers who come from different teacher training programs in Washington. Estimates of the variation in the effectiveness of graduates from different training programs show, for instance, that a program credentialing teachers whose average effectiveness places them at the 84th percentile versus the mean is roughly equivalent to a reduction in class size of between five to ten students.
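Why single out the 84th percentile? Presumably because, if program effectiveness is roughly normally distributed, one standard deviation above the mean lands at about the 84th percentile, so the comparison is really "a one-standard-deviation-better program versus the average one." A quick sanity check of that arithmetic (my gloss, not the paper's):

```python
# Under a standard normal distribution, the share of programs falling
# below one standard deviation above the mean is about 84 percent --
# which is likely why the paper compares the 84th-percentile program
# to the mean.
from statistics import NormalDist

pct = NormalDist(mu=0, sigma=1).cdf(1)  # fraction below +1 SD
print(round(pct * 100, 1))  # → 84.1
```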
The point here isn’t that Dan’s list is necessarily “right.” What I mean is, while there is a heroic mathematical effort here to fairly judge the Ed Schools, it’s just a first batch of data, and likely to move around over time.
The point here, in my opinion, is that they’re trying to take a first step toward asking and answering the question: which Ed Schools do a better job of preparing teachers? Few are asking; even fewer are answering.
My hope is that if these types of studies can grow, with scholars taking different empirical approaches, a list of successful teacher prep institutions would emerge. And once we know that, other folks can examine what might be learned from best-in-breed.
Barring studies like these, we have popularity lists. For example, US News & World Report rates the University of Washington’s Ed School as #9 in the nation. None of the other institutions in Washington state made the top 100.
But those rankings are not in any way based on how their teacher alumni ultimately do in classrooms, with real-life kids and real-life colleagues.
US News rankings are based on things like library collections, publications by professors, SATs of grad students, and opinions of other Ed School deans who have never actually visited the UW campus. I.e., mostly inputs. Stuff we really don’t care about very much. Dan’s data shows that the proud teacher alums of, say, University of Puget Sound (never heard of it) are at least the equals of UW grads, at least in this first batch of data. And how about that Walla Walla!
So while UW issues press releases extolling its magazine rankings, I think UPS and WW should challenge UW to a “teach-off,” to see who really is better at minting teachers….