UPDATE: The report discussed below is even more problematic than I thought.
The National Academies just published an assessment of U.S. graduate research programs. Rather than compiling a single ranking, they rank programs in a number of different ways -- and also publish the data on the variables used to calculate those rankings -- so you can sort the data however you like. Another aspect to like is that the methodology recognizes uncertainty and measurement error, so they actually estimate an upper and lower bound on each ranking (what they call the 5th and 95th percentile rankings, respectively).
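To make the percentile idea concrete, here is a minimal sketch of one way such bounds could arise: re-score all programs many times under randomly drawn variable weights and read off the 5th and 95th percentile of each program's rank. The programs, scores, and uniform weights below are entirely made up for illustration; the report's actual resampling scheme differs in its details.

```python
import random

# Hypothetical programs, each with two standardized variable scores.
programs = {"A": [0.9, 0.7], "B": [0.8, 0.9], "C": [0.5, 0.6]}

def rank_of(target, weights):
    """Rank of `target` (1 = best) under a given weight vector."""
    scores = {p: sum(w * v for w, v in zip(weights, vals))
              for p, vals in programs.items()}
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(target) + 1

random.seed(0)
# Re-rank program B under 1000 randomly drawn weight vectors.
ranks = sorted(rank_of("B", [random.random(), random.random()])
               for _ in range(1000))
low = ranks[int(0.05 * len(ranks))]    # 5th percentile rank (best case)
high = ranks[int(0.95 * len(ranks))]   # 95th percentile rank (worst case)
print(low, high)
```

The point of reporting the pair rather than a single rank is that a program whose rank is sensitive to the weighting shows a wide interval, while a robustly placed program shows a narrow one.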
Ranked, Based on Research
So how do the data come out? Here are the top programs in terms of "research activity" (using the 5th percentile rankings):
1. University of Wisconsin-Madison, Psychology
2. Harvard University, Psychology
3. Princeton University, Psychology
4. San Diego State University-UC, Clinical Psychology
5. University of Rochester, Social-Personality Psychology
6. Stanford University, Psychology
7. University of Rochester, Brain & Cognitive Sciences
8. University of Pittsburgh-Pittsburgh Campus, Psychology
9. University of Colorado at Boulder, Psychology
10. Brown University, Cognitive and Linguistic Sciences: Cognitive Sciences
Yes, it's annoying that some schools have multiple psychology departments and thus each is ranked separately, leading to some apples v. oranges comparisons (e.g., vision researchers publish much faster than developmental researchers, partly because their data is orders of magnitude faster/easier to collect; a department with disproportionate numbers of vision researchers is going to have an advantage).
What is nice is that these numbers can be broken down in terms of the component variables. Here are rankings in terms of publications per faculty per year and citations per publication:
Publications per faculty per year
1. State University of New York at Albany, Biopsychology
2. University of Wisconsin-Madison, Psychology
3. Syracuse University Main Campus, Clinical Psychology
4. San Diego State University-UC, Clinical Psychology
5. Harvard University, Psychology
6. University of Pittsburgh-Pittsburgh Campus, Psychology
7. University of Rochester, Social-Personality Psychology
8. Florida State University, Psychology
9. University of Colorado-Boulder, Psychology
10. State University of New York-Albany, Clinical Psychology
Average Citations per Publication
1. University of Wisconsin-Madison, Psychology
2. Harvard University, Psychology
3. San Diego State University-UC, Clinical Psychology
4. Princeton University, Psychology
5. University of Rochester, Social-Personality Psychology
6. Johns Hopkins University, Psychological and Brain Sciences
7. University of Pittsburgh-Pittsburgh Campus, Psychology
8. University of Colorado-Boulder, Psychology
9. Yale University, Psychology
10. Duke University, Psychology
So what seems to be going on is that a lot of schools on the first list publish large numbers of papers that nobody cites. If you combine the two lists to get the average number of citations per faculty per year, here are the rankings. I'm including the numbers this time so you can see the distance between the top few and the others: the #1 program has nearly double the citation rate of the #10 program.
Average Citations per Faculty per Year
1. University of Wisconsin-Madison, Psychology - 13.4
2. Harvard University, Psychology - 12.7
3. San Diego State University-UC, Clinical Psychology - 11.0
4. Princeton University, Psychology - 10.6
5. University of Rochester, Social-Personality Psychology - 10.6
6. Johns Hopkins University, Psychological and Brain Sciences - 8.8
7. University of Pittsburgh-Pittsburgh Campus, Psychology - 8.3
8. University of Colorado-Boulder, Psychology - 8.0
9. Yale University, Psychology - 7.5
10. Duke University, Psychology - 6.9
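The way the two component lists combine is just a product of rates; the component values below are made up for illustration, while the #1 and #10 figures are the published ones from the list above.

```python
# Made-up component values for one hypothetical program:
pubs_per_faculty_per_year = 2.5   # metric behind the first list
citations_per_publication = 5.0   # metric behind the second list

# Multiplying the two rates gives the combined metric above.
citations_per_faculty_per_year = (pubs_per_faculty_per_year
                                  * citations_per_publication)
print(citations_per_faculty_per_year)  # 12.5

# Gap between #1 (13.4) and #10 (6.9) in the published numbers:
print(round(13.4 / 6.9, 2))  # 1.94 -- nearly double
```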
The biggest surprise for me on these lists is that the University of Pittsburgh is on them (it's not a program I hear about often) and that Stanford is not.
Student Support
Never mind the research -- how do the students do? It's hard to say, partly because the variables measured aren't necessarily the ones I would measure, and partly because I don't believe their data. The student support & outcomes composite is built from:
Percent of first year students with full financial support
Average completion rate within 6 years
Median time to degree
Percent of students with academic plans
Program collects data about post-graduation employment
That final variable is something that would only be included by the data-will-save-us-all crowd; it doesn't seem to have any direct relationship to student support or outcomes. The exclusive focus on first-year funding is also odd. I think it's huge that my program guarantees 5 years of funding -- and for 3 of those we don't have to teach. Similarly, one might care whether funding is tied to faculty or given to the students directly, or whether there are grants to attend conferences, mini-grants to do research not supported by your advisor, etc.
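For concreteness, a composite like this is typically a weighted sum of standardized variables. The weights and program values below are entirely hypothetical (the report derives its weights from faculty surveys); the sketch just shows the mechanics, including how a lower-is-better variable like time to degree enters with a negative weight.

```python
# Variables in the student support & outcomes composite, in order:
# 1. pct of first-year students with full financial support
# 2. average completion rate within 6 years
# 3. median time to degree (lower is better -> negative weight)
# 4. pct of students with academic plans
# 5. program collects post-graduation employment data (0/1)
weights = [0.25, 0.25, -0.15, 0.30, 0.05]  # hypothetical weights

def composite(values):
    """Weighted sum of already-standardized variable values."""
    return sum(w * v for w, v in zip(weights, values))

# Hypothetical program: 90% funded, 70% completion, mid time to
# degree, 55% with academic plans, collects placement data.
print(round(composite([0.9, 0.7, 0.5, 0.55, 1.0]), 3))  # 0.54
```

Because "percent with academic plans" carries a large weight in any such scheme, an error in that one variable propagates directly into the composite -- which is the concern raised below.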
But leaving aside whether they measured the right things, did they even measure what they measured correctly? The number that concerns me is "percent of students with academic plans," which is defined as the percentage who have lined up either a faculty position or a post-doctoral fellowship by graduation, and which is probably the most important variable of those they list for measuring the success of a research program.
They find that no school has a rate over 55% (Princeton is the highest). Harvard is at 26%. Not to put too fine a point on it: that's absurd. Occasionally our department sends out a list of who's graduating and what they are doing next. Unfortunately, I haven't saved any of them, but typically all but 1 or 2 people are continuing on to academic positions (there's often someone doing consulting instead, and occasionally someone who just doesn't have a job lined up yet). So the number should be closer to 90-95% -- not just at Harvard, but presumably at peer institutions.
This makes me worried about their other numbers. In any case, since the "student support" ranking is so heavily dependent on this particular variable, and that variable is clearly measured incorrectly, I don't think there's much point in looking at the "student support" ranking closely.
1 comment:
I wouldn't put too much stock in not having heard much about a program. Reputations do not spread evenly like a gas. You hear more about certain programs depending on what program you are in and what area of research you are involved in. For example, do you have any idea what the top I/O program is? (I don't.) The view of the landscape isn't the same everywhere.
This is part of the biggest issue with these, and all, rankings: many of the comparisons are apples and oranges. Even setting aside the issue of multiple programs (commdis, cogsci, edpsych, etc.), there is a lot of variation in the type of research focused on in psych departments.