Field of Science

Showing posts with label graduate school. Show all posts

NSF fellows can teach again

I reported last month that NSF was no longer allowing its graduate fellows to teach. According to an email I received earlier today, they are reconsidering the issue:


Each Fellow is expected to devote full time to advanced scientific study or work during tenure. However, because it is generally accepted that teaching or similar activity constitutes a valuable part of the education and training of many graduate students, a Fellow may undertake a reasonable amount of such activities, without NSF approval. It is expected that furtherance of the Fellow's educational objectives and the gain of substantive teaching or other experience, not service to the institution as such, will govern these activities. Compensation for such activities is permitted based on the affiliated institution’s policies and the general employment policies outlined in this document.

Graduate School Rankings

There have been a number of interesting posts in the last few days about getting tenure (1, 2, 3). One thing that popped out at me was the use of the National Research Council graduate school rankings in this post. I am surprised that these continue to be cited, given the deep flaws in the numbers. Notice I said "numbers", not "methodology". I actually kind of like their methodology. Unfortunately, the raw numbers that they use to determine rankings are so error-ridden as to make the rankings useless.

For those who didn't see my original posts cataloging the errors, they can be found here and here.

Apply to Graduate School?

Each year around this time, I try to post more information that would be of use to prospective graduate students, just in case any are reading this blog (BTW, are there any undergraduates reading this blog? Post in the comments!).

This year, I've been swamped. I've been focusing on getting a few papers published, and most of my time for blogging has gone to the Scientific-American-Mind-article-that-will-not-die, which, should I ever finish it, will probably come out early next year.

Luckily, Female Science Professor has written a comprehensive essay in The Chronicle of Higher Education about one of the most confusing parts of the application process: the pre-application email to a potential advisor. Everyone tells applicants to send such emails, but nobody gives much information about what should be in them. Find the essay here.

I would add one comment to what she wrote. She points out that you should check the website to see what kind of research the professor does rather than just asking, "Can you tell me more about your research?", which comes across as lazy. She also suggests that you put in your email whether you are interested in a terminal master's. Read the website before you do that, though, since not all programs offer terminal master's degrees (none of the programs I applied to did). Do your homework. Professors are much, much busier than you are; if you demonstrate that you are too lazy to look things up on the Web, why should they spend time answering your email?

---
For past posts on graduate school and applying to graduate school, click here.

A Frog at the Bottom of a Well

My college had a graduate admissions counselor, with whom I consulted about applying to graduate school. Unfortunately, different fields (math, chemistry, literature, psychology) use completely different methods of selecting graduate students (and, in some sense graduate school itself is a very different beast depending on the field). My counselor didn't know anything about psychology, so much of the information I was given was dead wrong.

My graduate school also provides a lot of support for applying for jobs. This week, there is a panel on "The View from the Search Committee," which includes as panelists professors from Sociology, Romance Languages & Literatures, and Organismic and Evolutionary Biology. That is, none of them are from Psychology. I do know that different fields recruit junior faculty in very different ways (for instance, linguistics practices a form of speed-dating at conferences as a first round of interviews, while psychology has no such system).

So...do I go? Keep in mind that I get lots of advice from faculty in my own department (and also from friends at other psych departments who have recently gone through the process). That is, how likely is it that the experience of these three professors will map on to the process I will actually go through? How likely is it that a one-hour panel can cover all the different variants of the process? How likely is it that there is information that would be relevant to anyone applying to any department that isn't obvious or something I am likely to already know?

Thoughts?

--------
The title of this post comes from an old proverb about a frog sitting at the bottom of a well, thinking that the patch of blue above is the whole world. Often (always?) we don't realize just how limited our own range of experience is.
photo: e_monk

New Grad School Rankings Don't Pass the Smell Test

The more I look at the new graduate school rankings, the more deeply confused I am. Just after publishing my last post, it suddenly dawned on me that something was seriously wrong with the publications-per-faculty data. Looking again at the Harvard data, the spreadsheet claims 2.5 publications per faculty for the time period 2000-2006. I think this is supposed to be per faculty per year, though it's not entirely clear. As will be shown below, there's no way that number can be correct.

First, though, here's what the report says about how the number was calculated:
Data from the Thompson Reuters (formerly Institute for Scientific Information) list of publications were used to construct this variable. It is the average over seven years, 2000-2006, of the number of articles for each allocated faculty member divided by the total number of faculty allocated to the program. Data were obtained by matching faculty lists supplied by the programs to Thompson Reuters and cover publications extending back to 1981. For multi-authored articles, a publication is awarded for each author on the paper who is also on a faculty list. 
For computer science, refereed papers from conferences were used as well as articles. Data from résumés submitted by the humanities faculty were also used to construct this variable. They are made up of two measures: the number of published books and the number of articles published during the period 1986 to 2006 that were listed on the résumé. The calculated measure was the sum of five times the number of books plus the number of articles for each allocated faculty member divided by the faculty allocated to the program. In computing the allocated faculty to the program, only the allocations of the faculty who submitted résumés were added to get the allocation.
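As I read it, the report's formula amounts to something like the sketch below. The counts here are invented for illustration; the real inputs came from matching faculty lists against the Thomson Reuters database, which can't be reproduced from the outside.

```python
# Sketch of the report's publications-per-faculty measure, as I read it.
# All counts below are hypothetical; the report's actual inputs came from
# matching program faculty lists against the Thomson Reuters database.

def pubs_per_faculty(total_articles, n_allocated_faculty, years=7):
    """Average articles per allocated faculty member per year, 2000-2006."""
    return total_articles / (n_allocated_faculty * years)

def humanities_measure(n_books, n_articles, n_allocated_faculty):
    """Humanities variant: five times books plus articles, per faculty."""
    return (5 * n_books + n_articles) / n_allocated_faculty

# e.g., 500 author-matched articles across 30 allocated faculty:
print(round(pubs_per_faculty(500, 30), 2))  # 2.38
```

Note that under this reading, a department of 30 would need only about 500 author-matched articles over the whole seven-year window to hit a number like Harvard's reported 2.5.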
The actual data

I took a quick look through the CVs of a reasonable subset of faculty who were at Harvard during that time period. Here are their approximate publications per year (modulo any counting errors on my part -- I was scanning quickly). I should note that some faculty list book chapters separately on their CVs, but some do not. If we want to exclude book chapters, some of these numbers would go down, but only slightly.

Caramazza 10.8
*Hauser 13.6
Carey 4.7
Nakayama 5.9
Schacter 14.6
Kosslyn 10.3
Spelke 7.7
Snedeker 1.1
Wegner 2.3
Gilbert 4.0

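Averaging the rates I counted above gives a crude check (crude because of my quick scan, and because this is only a subset of the faculty):

```python
# Average of the per-year publication rates I counted from the CVs above.
rates = {
    "Caramazza": 10.8, "Hauser": 13.6, "Carey": 4.7, "Nakayama": 5.9,
    "Schacter": 14.6, "Kosslyn": 10.3, "Spelke": 7.7, "Snedeker": 1.1,
    "Wegner": 2.3, "Gilbert": 4.0,
}
mean_rate = sum(rates.values()) / len(rates)
print(round(mean_rate, 1))  # 7.5 -- three times the report's 2.5
```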

One thing that pops out is that people doing work involving adult vision (Caramazza, Nakayama, Kosslyn) publish a lot more than developmental folk (Carey, Spelke, Snedeker). The other thing is that publication rates are very high (except for my fabulous advisor, who was not a fast publisher in her early days, but has been picking up speed since 2006, and Wegner, who for some reason in 2000-2002 didn't publish any papers).

What on Earth is going on? I have a couple hypotheses. First, I know the report used weights when calculating composite scores for the rankings, so perhaps 2.5 reflects a weighted number, not an actual number of publications. That would make sense except that nothing I've found in the spreadsheet itself, the description of variables, or the methodology PDF supports that view.

Another possibility is that the list above covers only about 1/4-1/3 of the faculty, and perhaps I'm over-counting power publishers. Perhaps. But unless the people I left off the list weren't publishing at all, it would be very hard to get an average of 2.5 publications per faculty per year. And I know I excluded some other power publishers (Cavanagh was around then, for instance).

A possible explanation?

The best explanation I can think of is that the report actually is including a bunch of faculty who didn't publish at all. This is further supported by the fact that the report claims that only 78% of Harvard faculty had outside grants, whereas I'm pretty sure all professors in our department -- except perhaps brand new ones who are still on start-up funds -- have (multiple) outside grants.

But there are other faculty in our department who are not professors and do not (typically) do (much) research -- and thus do not publish or have outside grants. Right now our department lists 2 "lecturers" and 4 "college fellows." They're typically on short appointments (I think about 2 years). They're not tenure track, they don't have labs, they don't advise graduate students, and I'm not even sure they have offices. So in terms of ranking a graduate program, they're largely irrelevant. (Which isn't a slight against them -- I know two of the current fellows, and they're awesome folk.)

So of 33 listed faculty this year, 6 are not professors with labs and thus are publishing at much lowered rates (if at all) and don't have outside grants. That puts us in the ballpark of the grant data in the report (82%). I'm not sure if it's enough to explain the discrepancy in publication rates, but it certainly would get us closer.
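The arithmetic behind that ballpark figure, assuming the lecturers and fellows are the only listed faculty without outside grants:

```python
# Back-of-envelope: if 6 of the 33 listed faculty are lecturers/college
# fellows without labs or outside grants, the grant rate among listed
# faculty comes out to roughly:
listed_faculty, non_research = 33, 6
grant_rate = 100 * (listed_faculty - non_research) / listed_faculty
print(round(grant_rate))  # 82 (%), vs. the report's 78%
```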

Again, it is true that the lecturers and fellows are listed as faculty, and the report would be within its rights to count them ... but not if the report wants to measure the right thing. The report purports to measure the quality and quantity of the research put out by the department, so counting non-research faculty is misleading at best.

Conclusion

Between this post and the last, I've found some serious problems in the National Academies' graduate school rankings report. Several of the easiest-to-quantify measures they include simply don't pass the smell test. They are either measuring the wrong thing, or they're complete bullshit. Either way, it's a problem.

(Or, as I said before, the numbers given have been transformed in some peculiar, undocumented way. Which I suppose would mean at least they were measuring the right thing, though reporting misleading numbers is still a problem.)
------
*Before anyone makes any Hauser jokes, he was on the faculty, so he would have been included in the National Academies' report, which is what we're discussing here. In any case, removing him would not drastically change the publications-per-faculty rate.

The Best Graduate Programs in Psychology

UPDATE * The report discussed below is even more problematic than I thought.

The National Academies just published an assessment of U.S. graduate research programs. Rather than compiling a single ranking, they rank programs in a number of different ways -- and also published data on the variables used to calculate those different rankings -- so you can sort the data however you like. Another aspect to like is that the methodology recognizes uncertainty and measurement error, so they actually estimate an upper bound and lower bound on each ranking (what they call the 5th and 95th percentile rankings, respectively).

Ranked, Based on Research

So how do the data come out? Here are the top programs in terms of "research activity" (using the 5th percentile rankings):

1. University of Wisconsin-Madison, Psychology
2. Harvard University, Psychology
3. Princeton University, Psychology
4. San Diego State University-UC, Clinical Psychology
5. University of Rochester, Social-Personality Psychology
6. Stanford University, Psychology
7. University of Rochester, Brain & Cognitive Sciences
8. University of Pittsburgh-Pittsburgh Campus, Psychology
9. University of Colorado at Boulder, Psychology
10. Brown University, Cognitive and Linguistic Sciences: Cognitive Sciences

Yes, it's annoying that some schools have multiple psychology departments and thus each is ranked separately, leading to some apples v. oranges comparisons (e.g., vision researchers publish much faster than developmental researchers, partly because their data is orders of magnitude faster/easier to collect; a department with disproportionate numbers of vision researchers is going to have an advantage).

What is nice is that these numbers can be broken down in terms of the component variables. Here are rankings in terms of publications per faculty per year and citations per publication:

Publications per faculty per year


1. State University of New York at Albany, Biopsychology
2. University of Wisconsin-Madison, Psychology
3. Syracuse University Main Campus, Clinical Psychology
4. San Diego State University-UC, Clinical Psychology
5. Harvard University, Psychology
6. University of Pittsburgh-Pittsburgh Campus, Psychology
7. University of Rochester, Social-Personality Psychology
8. Florida State University, Psychology
9. University of Colorado-Boulder, Psychology
10. State University of New York-Albany, Clinical Psychology

Average Citations per Publication


1. University of Wisconsin-Madison, Psychology
2. Harvard University, Psychology
3. San Diego State University-UC, Clinical Psychology
4. Princeton University, Psychology
5. University of Rochester, Social-Personality Psychology
6. Johns Hopkins University, Psychological and Brain Sciences
7. University of Pittsburgh-Pittsburgh Campus, Psychology
8. University of Colorado-Boulder, Psychology
9. Yale University, Psychology
10. Duke University, Psychology

So what seems to be going on is that there are a lot of schools on the first list that publish large numbers of papers that nobody cites. If you combine the two lists to get the average number of citations per faculty per year, here are the rankings. I'm including the numbers this time so you can see the distance between the top few and the others. The #1 program doubles the citation rate of the #10 program.

Average Citations per Faculty per Year


1. University of Wisconsin-Madison, Psychology - 13.4
2. Harvard University, Psychology - 12.7
3. San Diego State University-UC, Clinical Psychology - 11.0
4. Princeton University, Psychology - 10.6
5. University of Rochester, Social-Personality Psychology - 10.6
6. Johns Hopkins University, Psychological and Brain Sciences - 8.8
7. University of Pittsburgh-Pittsburgh Campus, Psychology - 8.3
8. University of Colorado-Boulder, Psychology - 8.0
9. Yale University, Psychology - 7.5
10. Duke University, Psychology - 6.9
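If the combined measure really is the product of the two component variables (my inference; the report doesn't spell it out), the top-to-bottom spread in the list above checks out as roughly a factor of two:

```python
# Citations per faculty per year = (pubs/faculty/year) x (citations/pub),
# assuming the combined measure is a simple product (my inference).
top_program, tenth_program = 13.4, 6.9  # Wisconsin-Madison vs. Duke
print(round(top_program / tenth_program, 2))  # 1.94 -- roughly double
```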

The biggest surprise for me on these lists is that the University of Pittsburgh is on them (it's not a program I hear about often) and that Stanford is not.

Student Support

Never mind about the research; how do the students do? It's hard to say, partly because the variables measured aren't necessarily the ones I would measure, and partly because I don't believe their data. The student support & outcomes composite is built out of:

Percent of first year students with full financial support
Average completion rate within 6 years
Median time to degree
Percent of students with academic plans
Program collects data about post-graduation employment

That final variable is something that would only be included by the data-will-save-us-all crowd; it doesn't seem to have any direct relationship to student support or outcomes. It is also odd that they focus only on first-year funding. I think it's huge that my program guarantees 5 years -- and for 3 of those we don't have to teach. Similarly, one might care whether funding is tied to faculty or given to the students directly. Or whether there are grants to attend conferences, mini-grants to do research not supported by your advisor, etc.

But leaving aside whether they measured the right things, did they even measure what they measured correctly? The number that concerns me is "percent of students with academic plans," which is defined as the percentage who have lined up either a faculty position or a post-doctoral fellowship by graduation, and which is probably the most important variable of those they list in terms of measuring the success of a research program.

They find that no school has a rate over 55% (Princeton). Harvard is at 26%. To put not too fine a point on it, that's absurd. Occasionally our department sends out a list of who's graduating and what they are doing next. Unfortunately, I haven't saved any of them, but typically all but 1 or 2 people are continuing on to academic positions (there's often someone who is doing consulting instead, and occasionally someone who just doesn't have a job lined up yet). So the number should be closer to 90-95% -- not just at Harvard, but presumably at peer institutions.

This makes me worried about their other numbers. In any case, since the "student support" ranking is so heavily dependent on this particular variable, and that variable is clearly measured incorrectly, I don't think there's much point in looking at the "student support" ranking closely.

Caveat emptor: Is academia a pyramid scheme?

That's the question on the blogs this week (see here and here). The question arises because each professor will have some number of students during their career (10-20 is common among the faculty I know), whereas the number of professorships increases very slowly. So the number of PhDs being produced far exceeds the number of academic positions.

As pointed out elsewhere, this neglects the fact that many PhD students have no intention of going into academia. Even so, it's clear the system is set up to produce more graduates who want academic jobs than there are jobs available. Prodigal Academic wonders if that's any different from any profession -- generally, there are more people who want the best jobs than there are best jobs to go around. Unlike PA, who doesn't think there's a problem, Citation Needed thinks most people entering graduate school aren't aware of how unlikely it is that they will get a tenure-track job, partly because it isn't in the schools' interest to mention this.

It depends

I largely agree with these fine posts, but I think they overgeneralize. Not all PhD programs are the same. Different fields vary wildly in terms of number of students produced, the likelihood of getting an industry job, etc., and also in terms of the caliber of the program. For instance, nearly every graduate of the psych program at Harvard goes on to get a tenure-track job. A sizable percentage get tenure-track jobs at the top institutions (Harvard, Yale, UChicago, etc.).

On the other hand, at even highly-respected but lower-ranked schools, getting a tenure-track job seems to be the exception. Here I have less personal experience, but a friend from Harvard who is a post-doc at a well-known state school was surprised to discover basically none of the students in that program expected to get an academic job. I've heard similar stories from a few other places.

A common problem

This isn't unique to academia. Many people believe lawyers earn a lot of money. Much fuss is made in the New York Times about how the starting salary at a major law firm is around $170,000/year (or was, prior to the Great Recession). While basically anyone who graduates from the top three law schools and wants such a job can get one (some go into lower-paying public-interest or public-service work), at most law schools few if any graduates land such jobs, and most lawyers never earn anywhere near that much. As a first approximation, nobody who graduates from law school lands a big-firm job, just as, as a first approximation, nobody with a PhD gets a tenure-track job at a top research institution.

From my vantage point, the problem is that media (newspapers, movies, etc.) fixate on the prosperous tip of the iceberg. Newspapers do this because their target audience (rather, the target audience of many of the advertisers in newspapers) are people who themselves graduated from Harvard or Yale and for whom getting a tenure-track job or being partner at a major law firm is a reasonably common achievement. Movies and television shows do this for the same reason everyone is beautiful and rich on the screen -- nobody ever said Hollywood was realistic.

This is fine as far as it goes, but it can get people into trouble when they don't realize (a) that the media are presenting the outliers, not the norm, and/or (b) just where their own school/program fits into the grand scheme of things. As Citation Needed points out, it's not necessarily in the interest of less successful schools to warn incoming students that their chances of a job are poor. And, particularly in the realm of undergraduate education, there are certainly schools that cynically accept students knowing that their degree is so worthless that the students will almost certainly default on their loans.

What to do

Obviously the real onus is on the student (caveat emptor) to make sure they know what their chances of getting the job they want are prior to matriculating -- and this is true for every degree, not just PhDs. For most schools -- undergraduate and particularly graduate -- you can get data on how graduates fare in the marketplace. This can help determine not only which school to go to but whether it's worth going to school at all (it may not be). But to the extent it is in society's interest that people aren't wasting time and money (often as not, taxpayer money), it is worth considering how, as a society, we can make sure that not only is the information available, but people know that it's available and where to get it.

The Academic Job Market Tanks

"This is a year of no jobs." Ph.D.s are stacked up "like planes hovering over La Guardia." -- Catharine Stimpson, dean of the Graduate School of Arts and Sciences at New York University.

The above quote is taken from a recent article in the New York Times. Although people usually flock to graduate school in a down economy, the down economy means fewer spots in graduate school. This is just as well, it seems, if there are fewer jobs for graduating Ph.D.s.

The article is based mostly on anecdote, but the anecdotes match what I have seen as well. A graduate student from UT-Austin frets that more and more job searches have been pulled as universities announce hiring freezes. Two colleagues of mine who were on the market this year also reported jobs they had applied for disappearing. One has managed to find a post-doc position; the future of the other is uncertain.

For those who want numbers, there are a few in the article. It reports a 15% drop in history department job searches and a 25% drop in the length of the American Mathematical Society's largest list of job postings.

In addition to the problems faced by people on the market, this is problematic for a country that wants to increase its intellectual output. Ph.D. programs are long and hard, and not worth it if there is no job at the end. Discouraging employment figures are not going to help the president's goal of increasing our nation's supply of scientists and engineers. To the extent that the work of historians and area-studies researchers informs policy, it seems we'd want to make sure there are employment prospects for humanities students as well.

The Times offers no numbers here, but the article quoted a few discouraged undergraduates who are putting off graduate study (though frankly I don't think going straight from undergraduate to graduate programs is a good idea, anyway). Moreover, it points to Thomas Benton, a columnist for The Chronicle of Higher Education -- academia's trade journal -- who has been actively discouraging students from going into the humanities, arguing that it makes no sense unless you are wealthy or well-connected. I'm not sure undergraduates read the Chronicle, but the existence of that sentiment is troubling.

Ours is a knowledge-driven economy. Everybody seems to recognize that in the push to get more Americans to go to college. Hopefully, there will be professors there to teach them.

Getting in to Graduate School

It's standard dogma that when the economy is bad, people go back to school. Although it doesn't appear to be major news yet, a number of schools are reporting an increase in applications (here and here, but see also here).

Despite an increase in applications, it is very possible fewer people will actually go to graduate school. This recession may be unique.

There are two problems. First, masters, MD and JD programs are very expensive, and students typically require loans. I shouldn't have to elaborate on why this might present a difficulty for the prospective graduate student right now.

Second, universities are cutting the number of students they are admitting. I don't have systematic numbers, but I know that the Harvard Graduate School of Arts and Sciences is reducing the number of students admitted for PhD programs. If the richest university in the country is slashing enrollment, I don't think I'm going out on too far a limb in assuming others are as well. Large private universities are depending on their endowments (i.e., the stock market) to cover operating expenses, and students are expensive. State schools are dependent on government financing, which is also drying up.

It is obvious why PhD students at a school like Harvard are an expense: instead of paying tuition, they are paid a salary by the school. I don't know if the enrollment cut will hit the professional schools. It is well-known that undergraduate programs are typically run at a short-term loss (tuition does not cover expenses), with the school figuring they'll make up the difference in future alumni donations. I do not know, but suspect, that the same is true for the professional schools. That said, the only schools at Harvard right now that don't seem to have a hiring freeze are the Law and Medical schools.

As I said, this is not being widely reported, and I do not have numbers for the industry as a whole. Hopefully I am wrong, because such a trend would be bad. During a recession, more people suddenly have time for school. When the recovery comes, it meets a better-educated and more capable workforce, (presumably) further fueling the recovery. This time, the opposite may happen.

Getting a Ph.D. in psychology

Some may have noticed that my posts have been infrequent for the last week or two and wondered why. There is a simple answer to this:

Quals.

What are quals? They seem to differ between universities, and quite possibly even between departments. The top Google hit for "qualifying exam" sounds absolutely nothing like what I am doing. This seems to be true of graduate school in general, which is to say that policies differ a great deal. I certainly got into trouble as a prospective graduate student by assuming that information I learned about one graduate program would generalize to another.

One purpose of this blog is to make more information about the process available. So, for those who are interested:

As far as I can tell, the traditional qualifying exam is an examination that qualifies one to work on a Ph.D. That certainly seems to be the case in Piled Higher and Deeper, which is set at Stanford (see the comic below). 

In my department, it works very differently. Our qualifying exams are rolled into a course we take during our first year (usually). This is before we get our Master's degree, which typically comes at the end of the second year.

The course is different depending on which research group you belong to. My research group (developmental) actually requires students to take our own qualifying course as well as another. I took the developmental course last semester and am taking the cognition, brain and behavior course this semester.

What is required for the courses can vary a great deal depending on which professor is in charge. This semester, we have a total of 63 hours of examination spread out over 6 tests -- three in the middle of the semester, and three this week. Which is why I have not been posting much.


Interview Daze

First-year graduate students in my program are in charge of organizing interviews for prospective graduate students. We were given notice last Friday; the interviews start this coming Tuesday. So it's been a busy week.

When I applied to PhD programs in Psychology the first time around, I didn't know there were interviews. Most department websites don't mention them, and the only people I knew who had been to graduate school recently were in other fields and didn't do interviews. So I applied to graduate school and went to Spain for the spring, and was very surprised when I started getting invitations to visit schools. A friend of mine recently told me she also had no idea interviews would be required. It turns out, in fact, that some schools do interviews and some do not. It is extremely difficult to find out which schools are which.

I bring up this story because I think it is emblematic of the graduate school admissions process, at least in psychology. Information is scarce, and the procedure varies considerably from school to school. I don't know whether knowing more about the process would help you get into a program, but it seems reasonable to assume so. In that case, there would be a significant advantage for people already on the inside.

To put this into a concrete example, suppose you want to get a PhD in psychology at Harvard. If you are an undergraduate at Stanford or Yale, it's very likely that your professors can tell you a lot about the admissions process at Harvard (which is quite different from that at Stanford or Yale, as it turns out), because there is a lot of cross-talk between those three schools. If you are an undergraduate at a regional public university, it's much less likely you can get access to that kind of information.

Access to information may not translate into access to admissions. I certainly hope it does not. But, on the off-chance that it does, one goal of this blog is to give more information about the admissions processes to the extent that I can. If any aspiring students have questions, you should be sure to ask.