RANKING, SPANKING: SETH ABRAMSON RESPONDS

In a recent blog post, James McGirk wondered whether it was possible to rank writing programmes. What metric would we use? He considered the controversy over Seth Abramson's ranking of the top-50 post-graduate writing programmes for 2010 (published in the Nov/Dec 2009 issue of Poets & Writers magazine), and came away suspecting that Abramson was perhaps "a better poet than statistician". Seth Abramson took issue with quite a few aspects of McGirk's post. Here is his response in full (which spam barriers prevented him from filing as a comment; where were those filters for all those folks trying to sell cheap Ugg boots?):

James,

Just to correct some things you wrote: U.S. News & World Report last gathered data on graduate creative writing programs in 1995 (and last published original findings in 1997); the sole data-point used by U.S. News for its ranking was a one-question "faculty survey" on the question of national reputation (not an independent assessment of "national reputation" and *also* a "faculty survey," as you wrote); the 1997 U.S. News rankings were *not* exclusively rankings of full-residency MFA programs, as the 2010 P&W rankings were (the former also included dozens of M.A., Ph.D., and low-residency programs); in 1995 there were approximately 60 MFA programs in the U.S., whereas there were more than 140 when the P&W rankings were done (meaning, for instance, that Columbia's #22 ranking in 2010 would have been the equivalent of a #10 ranking in 1997); and finally, of the top 36 full-residency MFA programs in 1995, the 2010 P&W top 36 contained 26 of the same schools--and the 10 previously-top-36 schools that fell outside the top 36 in 2010 included George Mason (#37), Boston University (#38), Ohio State (#40), Maryland (#41), Florida State (#42), and Penn State (#46).
Seen in this light I'm not sure how you could say that the 2010 rankings were much of a surprise to anyone--indeed, given that there were roughly 133% more programs in 2010 than there were in 1995, all of the programs I've listed above as ranking "outside the top 36 in 2010" should actually be treated as having placed (in 1997 terms) well inside it. The few schools that fell precipitously in the fifteen years between 1995 and 2010--and I think we can agree that one would expect some movement in a decade and a half--all share a common feature: their lack of funding for students was well-publicized across that fifteen-year period (e.g., Utah, Emerson, and Pittsburgh). So statistically, the 2010 rankings showed an incredibly high degree of similarity to the 1997 rankings, contrary to your claim, except that certain schools moved up or down slightly (e.g., your own program, Columbia, dropped slightly from #4 in 1997 to #10 [in 1997 equivalency] in 2010).

I think it's important your readers know several additional facts: that the 2007 article in The Atlantic you cite as constituting the "first official ranking" of MFA programs since 1997 in fact says (if anyone cares to read the article itself) that it is not a ranking, but a non-exhaustive series of "lists" of strong programs; that the focus of these "lists" and the article that accompanied them was on fiction programs (the article appeared in The Atlantic's annual Fiction Issue), whereas the P&W rankings were multi-genre--one reason that Irvine, Columbia, and Boston University fared less well (they have never been as popular among poets as among fiction-writers, and poets made up a substantial percentage of the polling sample in 2010); and that the AWP, which you reference opposing the 2010 P&W rankings, opposed--with equal force and equally publicly--both the 1997 U.S.
News & World Report rankings and the 2007 Atlantic "lists," as the AWP has a stated policy of opposing any rankings of MFA programs (and your failure to note this wrongly implies that AWP singled out my rankings as particularly suspect). If you've ever interviewed any MFA faculty or MFA graduates outside of New York City, you'll find that many found the 2010 P&W rankings very much in line with their own expectations and experiences, as I heard from the countless MFA directors and MFA alumni I was in touch with between 2007 and 2009 (on a separate note, more than 40 of the programs ranked by P&W have since published articles touting their placement in the new rankings).

Finally, you fail to note that, in the three years I collected data on MFA programs, no program anywhere in the United States was more public in its opposition to the P&W rankings than your alma mater [Columbia University], an almost entirely unfunded program in one of the most expensive locales in the country. Your suggestion that anyone who wants to avoid $100,000+ in personal debt for a non-marketable, non-professional graduate degree is somehow not up to being a writer should probably be read, then, through the lens of your own financial decision in attending Columbia. Most applicants these days believe that creative writers shouldn't have to pay for their terminal degree--any more than doctoral students (for whom the Ph.D. is the terminal degree) do.

I also challenge you to search the web and try to find *specific criticisms* of any program's individual ranking in the 2010 P&W article. Other than Stacey Harwood's methodological criticism, I couldn't find anyone--nor could P&W--who actually thought the P&W rankings looked wrong, or funny, or "off" (my word), as you've implied was the general opinion on the web in your blog post. Where's your sourcing for that?
Perhaps the lack of consternation about the rankings themselves--as opposed to the larger fact that any rankings were done at all, which certainly did raise some ire among supporters of unfunded programs--is attributable to the fact that, as I indicated above, there wasn't much that was very surprising about the rankings in the final accounting. One MFA director I spoke to (the director of a top-10 program) said that, other than UMass being ahead of Cornell and Columbia being outside the top 20, there wasn't a single placement in the top 25 that looked at all odd to him.

You might also have noted that I ended my blog because of some quite serious veiled (and not-so-veiled) personal and professional threats from detractors of the rankings--not because I don't stand by my research, as I made clear in a follow-up response to letters written to the P&W Editor in the January/February issue of Poets & Writers. That the one blogger you cite as criticizing my methodology (Stacey Harwood) is also married to the head of the largest unfunded MFA program in the United States--a program that was publicly and extremely displeased with its placement in the P&W rankings--probably should also have been noted by a journalist who's written for The Economist [EDITOR'S NOTE: More Intelligent Life is published by The Economist Group, but is distinct from The Economist].

If you'd contacted me before penning this article I could have corrected you on all of these points, and on many more of an even more esoteric nature. My understanding is that in journalism it's common to contact the subject of an article before going to publication. There is no better proof that blogging isn't journalism, then, than the post above; in contrast, my article in P&W was fact-checked and meticulously researched over a period of three years--and contained fifteen categories of data for fifty programs, plus a further ranking of eighty-eight additional programs online.
What I read above I imagine you typed out in a few minutes or so--that's the problem with the internet, isn't it?

Seth Abramson
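For readers curious about the rank-equivalence arithmetic in the letter (a #22 ranking among the 140-odd programs of 2010 being "the equivalent of a #10 ranking" among the roughly 60 programs of 1995), it amounts to a simple proportional rescaling of rank by the number of ranked programs. The sketch below is an illustration of that assumption only, not part of the P&W methodology:

```python
def equivalent_rank(rank: int, n_then: int, n_now: int) -> int:
    """Rescale a rank among n_now programs to its equivalent among
    n_then programs, preserving the percentile position."""
    return round(rank * n_then / n_now)

# Columbia's #22 of the 140+ full-residency programs ranked in 2010
# maps to roughly #9-10 of the ~60 programs that existed in 1995.
print(equivalent_rank(22, 60, 140))

# Growth in the number of programs, 1995 to 2010, as a percentage:
print(round((140 - 60) / 60 * 100))
```

The same figures show the field grew by about 133 per cent between 1995 and 2010 (140 programs is 233% *of* 60, but 133% *more than* 60).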