


#1 Rice, #2 Southern Methodist University…Another Problematic College Ranking

The Chronicle of Higher Education reports on a new university ranking, the Faculty Media Impact Project, which purports to measure “the degree to which faculty share their research with the broader public” relative to the amount of public funding received. Conducted by the Center for a Public Anthropology (“the Center”), the study quantifies professors’ media impact using citation counts generated from queries of Google News. Professors’ individual counts are then averaged by university and by social science department (e.g. Economics, Anthropology). Finally, the average citation count is divided by the academic unit’s share of total NSF funding to the social sciences (a calculation sketched below). The top five universities under this methodology are:

  1. Rice University
  2. Southern Methodist University
  3. MIT
  4. University of Texas – San Antonio
  5. University of Arkansas
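
As a rough illustration of the arithmetic described above, here is a minimal sketch of the ranking ratio, assuming the Center simply divides a unit’s average Google News citation count by its share of NSF social science funding. The department figures below are hypothetical, not the Center’s data, and the Center’s exact procedure may differ in details it does not disclose.

```python
# Minimal sketch of the ranking ratio: average media citations per professor
# divided by the unit's share of total NSF social science funding.
# All figures below are hypothetical.

def impact_score(citation_counts, nsf_share_pct):
    """Average Google News citations per professor divided by the academic
    unit's share (in percent) of total NSF social science funding."""
    avg_citations = sum(citation_counts) / len(citation_counts)
    return avg_citations / nsf_share_pct

# Hypothetical department: four professors, 2% of NSF social science funding
print(impact_score([10, 3, 0, 7], 2.0))  # 2.5
```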

Anyone familiar with mainstream college rankings (e.g. U.S. News & World Report) will probably be puzzled by this list. With the exception of MIT and, to a lesser extent, Rice University, it is topped by universities that aren’t particularly prestigious or highly regarded for their research. Where are the Harvards, Stanfords, and other universities that typically grace the top of college rankings? They rank lower by this metric because professors at the most prestigious universities, while cited in the media most frequently, also tend to receive a disproportionate share of NSF funding. For example, Harvard tops the list of average citations per professor with 13.93, but its professors also receive 1.3% of total NSF funding.

Erik Voeten, a professor of political science and blogger, criticizes this approach, arguing that “…by dividing easily interpretable information (average media citations by faculty) by a fairly arbitrary number, the rankings obscure rather than enlighten.” I agree with Dr. Voeten and have additional concerns. By design, these rankings champion the professor who is cited extensively in the media but receives no NSF funding. Though exceptions exist, NSF funding is generally quite difficult to obtain and highly prestigious, meaning that NSF recipients are typically (though not always) higher-caliber researchers than their unfunded counterparts. With that distinction in mind, the unfunded professor glorified by the Center’s rankings is not necessarily the academic the public benefits most from hearing from, all else equal.

For the purposes of the discussion that follows, let’s temporarily accept that the ratio of media citations to public funding is somehow an important and meaningful measure of the “return” on publicly funded research. Even on its own terms, however, the ratio is calculated so poorly that the resulting rankings are nonsensical. To begin with, the Center operationalizes (i.e. conceptualizes and measures) one of its key concepts, public funding, rather carelessly. As already stated, public funding is measured by the academic unit’s share of total NSF funding allocated to social science research. The most obvious problem is that most public funding for professors, especially within the social sciences, does not come from the NSF. Professors at public universities (e.g. #4 UT – San Antonio and #5 U. Arkansas) receive significant state and federal dollars that have nothing to do with the NSF, and even faculty at private universities receive public grants from agencies other than the NSF. A better, though more difficult, approach would tabulate total public dollars spent on a professor’s compensation or research. A symptom of the Center’s poor operationalization appears when a university’s faculty received no NSF funding at all. The return on public spending in these cases is supposedly infinite, or near infinite under the Center’s flawed workaround of adding .01 to the funding amount used in the denominator. The true return cannot be calculated accurately without a better estimate of public funding.
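
To see how the zero-funding workaround distorts the comparison, here is a short sketch. The 13.93 citations and 1.3% funding share are Harvard’s reported figures from above; the unfunded department’s numbers are hypothetical, and the units of the Center’s .01 adjustment (dollars versus percentage points) aren’t specified, so the exact magnitudes are illustrative only.

```python
def adjusted_score(avg_citations, nsf_share_pct):
    # The Center reportedly adds .01 to the denominator, which avoids a
    # division by zero for unfunded departments but makes their "return"
    # explode instead of being undefined.
    return avg_citations / (nsf_share_pct + 0.01)

# Harvard's reported figures: heavily cited and heavily funded
print(adjusted_score(13.93, 1.3))   # ~10.6

# Hypothetical department: barely cited, no NSF funding, yet ranked far higher
print(adjusted_score(0.50, 0.0))    # 50.0
```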

The study’s measure of citations is also flawed for a number of reasons. First, it makes no distinction between a professor being cited as a knowledgeable source and a professor whose work is being criticized. Another flaw, which the Center’s documentation readily discloses, is that the search queries almost certainly misattributed citations to faculty with common names. For example, the documentation mentions Adam Smith, an anthropologist at the University of Chicago whose citation count is inflated by the far more famous Scottish economist of the same name. The Center asserts that “…there is a ready solution to this difficulty. The Center would appreciate individuals with this problem to write the Center indicating which citations are falsely attributed to them. The inappropriate citations will be deleted from their scores.” The notion that faculty will even be aware of this request, let alone take the time to audit their own citation counts and report corrections that lower their scores, is farfetched. The inaccurate counts will likely remain uncorrected. Taken together, the rankings do a poor job of quantifying the construct of interest: public engagement per public dollar spent.

About the author

Benjamin Bohr

Benjamin is a manager at Fulcrum Financial Inquiry. He specializes in statistics and data analysis for use in high-stakes litigation, business valuations, and other financial matters.

