Field of Science

All Scientists Have Conflicts of Interest (Duh)

In a thoughtful and provocative piece in The Wild Side blog at NYTimes.com, Stephen Quake takes up the issue of conflicts of interest in research. By "conflicts of interest," Quake means researchers who have a financial interest in the outcome of their research. It is becoming increasingly common for academic researchers to partner with businesses in developing new technology.

He raises many important and interesting questions, some of which I'll write about in the future. One point that deserves much more space is that many researchers have conflicts of interest even when they aren't selling a product:

Interestingly, it is not unusual for basic scientists with no commercial relationships to be dependent on grants for their salaries and therefore have a significant personal financial interest in preserving their grants. Although COI experts have assured me that this is not a conflict that needs to be managed, I must confess that I have some difficulty with the distinction they are trying to draw. Who is under greater temptation to bias the results of their research: the financially comfortable academic entrepreneur, or the ivory tower scientist who may not be able to pay his mortgage if his grant is not renewed?
What may not be clear to the casual reader is that research agencies like the NIH prefer to give grants to researchers with a history of success. Last time they gave you a grant, did you publish a series of important papers, or did all your projects end in failure? In the latter case, you may be out of a job.

Nobody Sees Your Data But You

A baseball player is paid largely on his ability to perform on the field. If you are known as a power hitter, you had better produce home runs.

The major difference between a baseball star and a research star is that a baseball player's performance is public. Everybody at the game knows whether he hit a home run or struck out.

In contrast, the only person with access to a researcher's data is the researcher. It is as if a baseball player went to an automatic batting cage where nobody was looking, took a few swings, came out and told the team management how many home runs he hit and was paid accordingly.

A number of non-scientists I've talked to seem to be under the impression that during peer review, the reviewers check the data. They don't. They can't, really. In my experiments, I ask people questions and mark down whether they got it right or wrong. The reviewers weren't in the testing room with me, so they simply have to take my word for it.

Now let's say you are a researcher and it's time to renew your grant. You've been swinging and missing -- your data are uninterpretable or simply uninteresting. The easiest way to make the data more interesting is to "fix" them.

All Scientists Have Conflicts of Interest

Ultimately, all scientists have a conflict of interest, because our promotions and salaries are based, directly or indirectly, on the research we have produced. And there are many non-financial incentives as well: nobody wants to be seen as a failure.

I don't believe many people are simply making up their data (though it happens). A larger concern is selectively reporting results. Suppose you are in the situation in which you have run four different experiments. Three of them support one conclusion, but the fourth supports the opposite conclusion.

You simply can't publish that. No journal will take a paper showing conflicting results. You can either give up on the project and admit to having wasted perhaps 2-3 years of your life. You can continue doing research to try to figure out why you are getting conflicting results, perhaps succeeding, perhaps not, but in any case spending time and money you might not have. Or you can write a paper about the first three experiments and forget about the "bad" experiment. Some people take the last route. This is best known in pharmaceutical research, but it happens everywhere.

Even worse, perhaps you run an experiment whose results challenge the theory for which you are known, an experiment that calls into question the validity of your life's work. Who really wants to publish that?

What to Do

Quake's point is that you can't eliminate conflicts of interest from science. A baseball player needs to win games. A researcher needs to publish. As long as this is true, baseball players will have an incentive to take steroids and researchers will have incentives to "improve" their work. Any proposed solution that ignores these basic facts is doomed to fail.

Are Scientists Really Cheating?

This article might sound gloomy. My point is simply that everybody has an incentive to cheat, not that everybody does. In the course of your life, there will be a number of instances in which it would be in your financial interest to murder somebody. Still, most people don't murder.

It is popular in some circles to assume most people are fundamentally bad and will do anything they can get away with, but my experience -- professionally and personally -- is otherwise.

Moreover, academic fraud is generally not in one's long-term interests. Even if it isn't exposed, others will fail to replicate your results and your theories will be disproved. Again, for a researcher, ultimately the best thing for your career is to be right. And fraud won't help you with that.


(Image sourced from blog.springsource.com)

Obama's Science Budget

The administration has released a preview of its proposed budget. After years of stagnant funding for research, there are obviously many people interested to know whether Obama will live up to his promises of investing in science.

There is no centralized science directive, so it is hard to evaluate the budget as a whole. However, what I can find looks promising. The NSF 2010 budget includes a $950 million (16%) increase over 2008. NASA is similarly getting a $1.5 billion (9%) increase over 2008 levels. This is of course on top of large supplements to both programs through the Stimulus Package. The National Institutes of Health (NIH) is unfortunately buried within a larger department, and there isn't enough information to evaluate it. The Department of Energy appears to be getting increased research funding, but I wasn't able to track down exact numbers.

Concerns about the deficit aside, after so many years of depressing, damaging budgets, it is hard not to be excited. Someone at the top seems to actually care.

What the Stimulus Package Means for Science

When the American Recovery and Reinvestment Act went through Congress, I assumed my favorite provisions would be stripped out. Instead, the Senate increased the funding for NIH to over $6 billion. By the time President Obama signed the law, the total influx into NIH had reached $10.4 billion over two years (out of an annual budget of $29 billion, which has been flat for 6 years). I'm not sure when the extra $2 billion for NSF entered the bill, but that money is on its way as well.

Readers of this blog know that this is a major change in US policy. I had hopes for this administration, but this is actually a lot more than I hoped for.

Once the celebration is done, though, it's important to note that even if these funding increases become permanent, it will at best help us keep pace with the rest of the world, rather than continuing to fall behind the leaders. It's certainly not enough to guarantee America's future as the leading producer of science.

This New York Times article has more information on how the science stimulus moneys will be spent.

New Experiment: Learning Verbs

I run a graduate student research workshop for students in the linguistics and psychology departments here at Harvard. We did a segment on word-learning. One of our guest speakers started her presentation by pointing out that the field often thinks of word learning as learning nouns, despite the fact that there are many other types of words that probably have to be learned differently.

The experiments at the Cognition & Language Laboratory have been guilty of this oversight (this one for example). I'm happy to say that this flaw has been rectified. Late last week I posted the GorpTest, in which you can try to learn new verbs.

I'll be writing more on this topic in the near future. In the meantime, please participate.

Emotions Caused by Your Brain

Yesterday, the New York Times ran a science article under the following heading: In Pain and Joy of Envy, the Brain May Play a Role. It is a very well-written and engaging article on the subject of envy, which is a fascinating emotion.

The title, however, leaves something to be desired. It seems to imply that there was some doubt over whether the brain plays a role in envy. The modern scientific consensus is that the brain plays a role in all emotions.

So why write this article? To tell the truth, the article doesn't fit the title very well: the article isn't really about proving a role for the brain in envy. A better title might have been "The Science of Envy." I suspect the editors chose the title because it is well-known that people believe cognitive research more if you stick the word "brain" in a few places. In journalistic parlance, neuroscience is a good hook.

However, as a public service, and in a probably vain attempt to forestall future articles with titles such as "In Pain and Joy of love/hate/surprise/etc., the Brain May Play a Role," here is a list of emotions and psychological states (courtesy of WordNet) in which the brain certainly plays a role, to the extent we can be certain about anything:

abashment
abhorrence
absolute threshold
acidity
acridity
activity
addiction
affection
agape
aggravation
aggression
aim
alarm
anaphrodisia
angst
animus
antagonism
anxiety
anxiousness
aphrodisia
appetite
ardor
aroma
arousal
assumption
astringence
attention
attitude
attributing
auditory perception
awareness
awe
bad temper
beholding
belief
belligerence
beneficence
benevolence
bitter
blessedness
bloodlust
boding
bold
boredom
brave
breakdown
brightness constancy
caprice
caring
cautious
chafe
chagrin
chill
chromesthesia
class feeling
cogitation
cold
cold feet
color constancy
colored hearing
comfort zone
competence
concentration
concept
conditioned response
confidence
conscience
consideration
constancy
constriction
construction
construing
contemplation
contrast
copulation
cowardly
craving
creeps
curiosity
curious
cutaneous sensation
deficit
deliberation
desire
despisal
detection
devotion
difference threshold
diffidence
discombobulation
discomfiture
discontentment
disgruntlement
displeasure
dissatisfaction
dread
dream
driven
dudgeon
dysphoria
ecstasy
edginess
education
elation
embarrassment
embitterment
emotional state
empathy
emulation
enamoredness
engrossment
enmity
estimate
euphoria
exhilaration
expectation
exuberance
face recognition
faculty
faintness
fear
fed
feeling
finish (wine tasting)
fit
fondness
fond regard
fragility
frisson
frustration
fundamental
fury
gaiety
generosity
gloom
gloomy
gratefulness
gratification
gratitude
hackles
hankering
happiness
harassment
harmonic
hatred
health
heart
heartstrings
homesickness
horror
huffiness
humility
hunch
hysteria
idea
image
immersion
impatient
impression
incautious
incense
incentive
indignation
infirmity
infuriation
insecurity
intention
interest
interpretation
interpreting
intimidation
intoxication
introspection
intuitions
irascibility
ire
itching
jealousy
jitteriness
jnd
joy
jubilance
judgement
judgment
lazy
lecherousness
lemon
letdown
levity
libido
limen
lividity
longing
love
lovesickness
loyalty
maleficence
malevolence
malice
malodor
masking
meekness
melody
misanthropy
misocainea
misogamy
misogyny
misology
misoneism
misopedia
mittelschmerz
motivation
motive
murderousness
music
musical perception
music of the spheres
musings
musk
nationalism
naughty
need
nervous
niff
nostalgia
nymphomania
objective
objective or goal
object recognition
open
opinion
optical fusion
outlook
overtone
pain > twinge
pain threshold
panic
passion
patient
perspective
pessimism
phantom limb pain
piece
pining
pins and needles
pique
plan
point of view
pondering
position
pressure
projection
protectiveness
proud
prurience
pruritus
pruritus ani
pruritus vulvae
puppy love
purpose
quality of life
racket
radiance
rationalized
reaction
reasoning
reflection
regard
relish
representation
resentment
reserve
responsible
sadness
salt
satyriasis
scare
scent
scruple
security
seethe
self-depreciation
self-disgust
selfish
sensation > odor
sense
sensuality
serenity
serious
sexiness
sexual desire
shame
shamefacedness
shape constancy
shyness
side
sinking
size constancy
smell
soft spot
somesthesia
sound > tone
sour
speculation
speech perception
stage fright
stance
state
stomach
supposing
surmising
suspense
suspicious
sweet
sweet tooth
sympathy
synesthesia
taste > tang
temperature > warmth
temptation
tenderness
terror
thinking
thought
tickle
timid
timidness
tingle
titillation
to stop loving
tolerant
tone
topognosia
touch
trepidation
triumph
trust
twinge
umbrage
uneasiness
unhappiness
up
urge
urtication
vanilla
velleity
view
viewpoint
vigil
vindictiveness
vision
visual space
wakefulness
warmth
weakness
wistful
white noise
willies
wishfulness
wishing
wistfulness
withdrawn
worry
worship
wrath

Getting in to Graduate School

It's standard dogma that when the economy is bad, people go back to school. Although it doesn't appear to be major news yet, a number of schools are reporting an increase in applications (here and here, but see also here).

Despite an increase in applications, it is very possible fewer people will actually go to graduate school. This recession may be unique.

There are two problems. First, masters, MD and JD programs are very expensive, and students typically require loans. I shouldn't have to elaborate on why this might present a difficulty for the prospective graduate student right now.

Second, universities are cutting the number of students they admit. I don't have systematic numbers, but I know that the Harvard Graduate School of Arts and Sciences is reducing the number of students admitted to PhD programs. If the richest university in the country is slashing enrollment, I don't think I'm going too far out on a limb in assuming others are as well. Large private universities depend on their endowments (i.e., the stock market) to cover operating expenses, and students are expensive. State schools are dependent on government financing, which is also drying up.

It is obvious why PhD students at a school like Harvard are an expense: instead of paying tuition, they are paid a salary by the school. I don't know if the enrollment cut will hit the professional schools. It is well-known that undergraduate programs are typically run at a short-term loss (tuition does not cover expenses), with the school figuring they'll make up the difference in future alumni donations. I do not know, but suspect, that the same is true for the professional schools. That said, the only schools at Harvard right now that don't seem to have a hiring freeze are the Law and Medical schools.

As I said, this is not being widely reported, and I do not have numbers for the industry as a whole. Hopefully I am wrong, because such a trend would be bad. During a recession, more people suddenly have time for school. When the recovery comes, it meets a better-educated and more capable workforce, (presumably) further fueling the recovery. This time, the opposite may happen.

Why Don't Babies Talk Like Adults: Coglanglab at Scientific American

The Scientific American Mind Matters blog is running an article by me on language learning. The moderator of this blog, Jonah Lehrer, asks scientists to pick what they think is one of the most exciting recent papers and blog about it.

Here's how I set up the problem:

Many people assume children learn to talk by copying what they hear. In other words, babies listen to the words adults use and the situations in which they use them and imitate accordingly. Behaviorism, the scientific approach that dominated American cognitive science for the first half of the 20th century, made exactly this argument. This “copycat” theory can’t explain why toddlers aren’t as loquacious as adults, however. After all, when was the last time you heard literate adults express themselves in one-word sentences (“bottle,” “doggie”) or in short phrases such as, “Mommy open box.”
The rest of the post describes what I think is one of the most important recent language experiments and how it addresses this paradox.

Steven Pinker on Roberts-speak

If you haven't yet seen it, check out this New York Times editorial by Harvard Professor of Psychology, Steven Pinker. It is an analysis of (perhaps) why Chief Justice Roberts bungled the inaugural swearing-in.

The Assistant Village Idiot has a rather strange rebuttal. The author seems to believe Pinker's editorial was a political commentary. Well, it is, but Pinker is concerned about the politics of language (something he has been concerned with for a long time, as anyone who has read his books knows), not Supreme Court politics. The writer continues:

Pinker seems unable to restrain himself from injecting his political opinions into his discussions of language and thought. I wonder what that means?
One might ask the same question right back.

Have you ever shruck?


"Remember the time in 2003 when Bartlett came to work all hung over?" Laughs. "Nothing ever changes."
[Bush] continued: "We never shruck—"
"Shirked!" someone yelled.
"Shirked," Bush corrected, smiling. "You might have shirked; I shrucked. I mean we took the deals head on."
This is an excerpt from an account of George W. Bush's farewell party at the Spanish Ballroom in Glen Echo (which I know better as a middling swing dance venue; apparently the better places were all booked).

A number of people have been making hay about Bush's creative past-tense inflection of the verb shirk. This is probably because it fits with the general perception of Bush as barely literate. Not to defend one of the nation's most disastrous presidents, but shirk is actually a hard verb to inflect.

The Psychology of the Past Tense

Most of us were taught in school that to make the past tense of a verb (walk) you add an -ed (walked). Of course it turns out there are some irregular verbs (ran, slept) which have to be memorized as such. A simple theory would just state that these exceptions are on a metaphoric list: when an English speaker wants to put a verb into the past tense, she checks the list of exceptions first. If the verb is on the list, she uses that irregular form; if not, she adds -ed.
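This "list of exceptions" model is simple enough to sketch in a few lines of Python. The irregular forms below are just a few illustrative entries, not a real lexicon:

```python
# A minimal sketch of the exception-list model of the past tense.
# The irregular entries here are illustrative, not exhaustive.
IRREGULARS = {
    "run": "ran",
    "sleep": "slept",
    "creep": "crept",
}

def past_tense(verb):
    """Check the list of exceptions first; if the verb isn't on it, add -ed."""
    if verb in IRREGULARS:
        return IRREGULARS[verb]
    return verb + "ed"
```

Note that a lookup like this can only ever return "splinked" for a novel verb like splink; it has no mechanism for producing an analogical form.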

This seems like a decent theory, but it doesn't quite work. This is because people are perfectly capable of coming up with new irregular past tense forms. Suppose you heard a verb splink, which means to fall into a pool of water. What do you think the past tense would be? Many people would guess splunk. Our list model can't explain this, since that irregular isn't on the list. However, it seems clear where splunk comes from: it's an analogy to sink-sunk.

In fact, historically some verbs have become irregular. Once upon a time, the past tense of creep was creeped. So clearly people are capable of inventing new irregular forms that aren't on the metaphoric list.

The Past Tense Wars

How to fix this model was the focus of the far-reaching Past Tense Debate, in which I was once a minor participant. Although everybody's theory came to predict new irregular forms, none of the theories were very good at predicting a particular form (why splunk instead of splought, on analogy to think-thought? For some very interesting recent work on this problem, check out a series of recent papers by Adam Albright).

When I was doing this work, I would present participants with made-up verbs and ask them to give me a past tense. I got a lot of responses like splunk, but I also got very odd responses. It was not infrequent for a person to add or subtract a consonant (sadly, I don't remember any of the best examples). Many looked a good deal like turning shirk to shruck. Granted, few people made such big mistakes on real words (other than the infamous brung), but it seems clear Bush has a deficiency in (linguistic) planning and monitoring, so one would expect his irregularizations to be more prominent. (I'm actually sympathetic to the linguistic malady, at least, since I'm similarly inarticulate when speaking off the cuff.)

Parting Thoughts

As usual, Language Log got to this topic first and used much more impressive vocabulary (e.g., "metathetic").

Why People Don't Panic During a Plane Crash


A lot has been made about the crew and passengers of United Flight 1549 and their failure to panic when their plane landed in the Hudson. For instance, here is the Well blog at the New York Times:
Amanda Ripley, author of the book “The Unthinkable: Who Survives When Disaster Strikes — and Why” (Crown, 2008), notes that in this plane crash, like other major disasters, people tend to stay calm, quiet and helpful to others.

“We’ve heard from people on the plane that once it crashed people were calm — the pervading sound was not screaming but silence, which is very typical ... The fear response is so evolved, it’s really going to take over in a situation like that. And it’s not in your interests to get hysterical. There’s some amount of reassurance in that I think.’’
On a different topic, but along the same lines, the paper's Week in Review section discusses the fact that most people are coping with the recent economic collapse reasonably well, all things considered:
Yet experts say that the recent spate of suicides, while undeniably sad, amounts to no more than anecdotal, personal tragedy. The vast majority of people can and sometimes do weather stinging humiliation and loss without suffering any psychological wounds, and they do it by drawing on resources which they barely know they have.
Should we be surprised?
This topic has come up here before. People are remarkably bad at predicting what will make them happy or sad. Evidence shows that while many people think having children will make them happy, most people's level of happiness actually drops significantly after having children and never fully recovers even after the kids grow up. On the other end of the scale, the Week in Review article notes that
In a recently completed study of 16,000 people, tracked for much of their lives, Dr. Bonanno, along with Anthony Mancini of Columbia and Andrew Clark of the Paris School of Economics, found that some 60 percent of people whose spouse died showed no change in self-reported well-being. Among people who’d been divorced, more than 70 percent showed no change in mental health.
This makes a certain amount of sense. Suppose the mafia threatens to burn down your shop if you don't pay protection money, and suppose you don't pay. They actually have very little incentive to follow through on the threat, since they don't actually want to burn down your shop -- what they want is the money. (This, according to psychologist Steven Pinker, is one of the reasons people issue threats obliquely -- "That's a nice shop you have here. It'd be a shame if anything happened to it." -- so that they don't have to follow through in order to save face.)

Similarly, biology requires that we think we'll like having children in order to motivate us to have them. Biology also requires that we think our spouse dying would ruin our lives, in order to motivate us to take care of our spouse. But once we have children or our spouse dies, there is very little evolutionary benefit accrued by carrying through on the threat.

Finding the idea of a plane crash very scary: useful.
Mass panic and commotion during a crash: not so much.

Presidential Words

The New York Times has an interesting feature analyzing inaugural addresses. An interactive tool displays and visually ranks the most common words in each presidential inaugural address in US history. This is a common method for identifying the themes in a body of text, and as such it captures both a moment in time and the style of a particular president.
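Under the hood, tools like this boil down to counting word frequencies after discarding function words. Here is a minimal sketch in Python; the stop-word list is illustrative, not the one the Times actually used:

```python
from collections import Counter
import re

# A tiny, illustrative stop-word list (function words to ignore).
STOP_WORDS = {"the", "of", "and", "to", "a", "in", "that", "we", "is", "our"}

def common_themes(text, n=5):
    """Return the n most frequent non-stop-words in a speech."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(n)
```

Run over a full address, the top-ranked words give a rough picture of its themes.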

Can Your Brain Force You to Do Something You Don't Want to Do?

I have been reading Jerome Kagan's compelling recent book on emotion. I stumbled on one particular line:

An article in the June 20, 1988, issue of Time magazine, reporting on a woman who murdered her infant, told readers that the hormonal changes that accompany the birth process create emotional states, especially in women unprepared for the care of children, that can provoke serious aggression that women are unable to control. It is thus not fair, the journalist argued, to hold such mothers responsible for their horrendous actions. This conclusion is a serious distortion of the truth. There is no known hormonal change that can force a woman to kill her infant if she does not want to do so!
This raises one of the most difficult problems facing 21st-century ethics. We want to treat criminals differently if they are in control of their actions. For instance, a soldier who is ordered to commit an atrocity is, if still guilty, a bit less guilty than one who does the same thing just for kicks.

When the outside influence constraining your free will actually arises within your own body, it's a bit more difficult. Suppose Alfred goes on a drug-induced killing spree. Again, it's different from the just-for-kicks murderer, but then one might wonder if Alfred should have thought of the consequences before injecting himself with psychoactive drugs. Or what about somebody who had a psychotic break? Where do we draw the line between that and a bad mood?

Many people used to be comfortable drawing the line between psychosis and a bad mood using medical information. Anyone who acts under the influence of a medical condition is less culpable (or, at least, differently culpable) than somebody who is not. However, as neuroscientists find the brain correlates of conditions like a bad mood and geneticists find that nearly every personality trait is heritable (including being just plain mean), this line is breaking down.

To be fair, this is in essence not a new problem. Certain strains of Christian religious thinkers have spent centuries tying themselves into knots trying to explain how, given that everything is according to God's plan, including sin, it was not sacrilege to punish sinners, who, by definition, were just carrying out God's plan.

Nonetheless, Christian civilizations did not collapse under the weight of this paradox, and I suspect we'll get along for some time without a coherent answer to the Great Question of Free Will. But it would still be nice to have...

----
Kagan (2007) What Is Emotion, p. 80

What is the Longest Sentence in English?

Writers periodically compete to see who can write the longest sentence in literature. James Joyce long held the English record with a 4,391-word sentence in Ulysses. Jonathan Coe one-upped him in 2001 with a 13,955-word sentence in The Rotters' Club. More recently, a single-sentence, 469,375-word novel appeared.

Will they ever run out of words?

No. It's easy to come up with a long sentence if you want to, though typing it out may be a chore. Here's a simple recipe:

1. Pick a sentence you like (e.g., "'Twas brillig and the slithy toves did gyre and gimble in the wabe.")

2. Add "Mary said that" to the beginning of your sentence (e.g., "Mary said that 'twas brillig and the slithy toves did gyre and gimble in the wabe.")

3. Add "John said that" to the beginning of your new sentence (e.g., "John said that Mary said that 'twas brillig and the slithy toves did gyre and gimble in the wabe.")

4. Go back to step #2 and repeat.

If you keep this up long enough, you'll have the longest sentence in English or any other language.
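The recipe above can be written out as a few lines of Python, which makes the unboundedness obvious: each round adds exactly three words.

```python
def lengthen(sentence, rounds):
    """Steps 2-4 above: repeatedly prepend 'Mary said that' / 'John said that'."""
    speakers = ["Mary", "John"]
    for i in range(rounds):
        sentence = speakers[i % 2] + " said that " + sentence
    return sentence

# Two rounds reproduce the example in step 3.
print(lengthen("'twas brillig and the slithy toves did gyre and gimble in the wabe.", 2))
```

Set `rounds` high enough and you beat any record you like; nothing in the grammar stops you.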

Why this matters.

There are reasons to care about this other than immortalizing your name. This formula is a proof by demonstration that language learning is not simply a matter of copying what you have heard others say. If that were true, nobody could ever produce a longer sentence than the longest one they had ever heard.

However, making longer sentences is not simply a matter of stringing words together. You can't break the longest-sentence record by stringing together the names "John" and "Mary" 469,376 times. That wouldn't be a sentence.

This exercise is one of the most famous proofs that language has structure, and speakers of a language have an intuitive understanding of that structure (the other famous proof arguably being the sentence Colorless green ideas sleep furiously.).

Modern Environmentalism, or Which Would You Rather Save: Your Planet or Your Soul?

According to Johann Hari, writing for Slate, there are at least two kinds of environmentalists: the romantics and the realists.

The romantics—a tradition you can peel back to Wordsworth's daffodils—see environmental crises as primarily spiritual. They believe concrete and cities and factories are fundamentally inhuman, alienated habitats that can only make us sick. They cut us off from the natural rhythms of the land, and encourage us to break up the world into parts and study them mechanistically—when, in fact, everything is connected...

The rational environmentalists ... believe our crisis is not spiritual at all, but physical. Human beings didn't unleash warming gases into the atmosphere out of malice or stupidity or spiritual defect: They did it because they wanted their children to be less cold and less hungry and less prone to disease. The moral failing comes only very late in the story—when we chose to ignore the scientific evidence of where wanton fossil-fuel burning would take us. This failing must be put right by changing our fuel sources, not altering our souls.
We might recast these as those who want to return to nature, and those who want to preserve nature:

Diagnose the problem differently, and you end up with fundamentally different solutions. You can see this most clearly if you look at the environmentalist clash over cities, over how we should live: Is the way forward to build more cities or to try to get people to flee to the countryside?
Personally, I'm a realist (as is Hari). Hari, though, tries to make the case for realism by pointing to the thought processes involved (e.g., "the cities of human beings are as natural ... as are the colonies of prairie dogs or the beds of oysters."). However, I think realism is the only option for those of us who live in the real world -- and I'm not using that phrase metaphorically: I really mean anyone who lives on Planet Earth.

The Reality on the Ground.
Here's the problem: There are over 6,000,000,000 people on Earth. There are only 58,179,688 square miles of land. That is over 103 people per square mile. If we fan out into the countryside, there will be no countryside. And keep in mind that those 58,179,688 square miles of land include the Earth's deserts (14% of land) and high mountains (27% of land).
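The arithmetic above is easy to check. A quick sketch, using the circa-2009 figures quoted in the text:

```python
# Reproducing the back-of-the-envelope figures above.
people = 6_000_000_000      # world population, circa 2009
land_sq_mi = 58_179_688     # total land area of Earth

density = people / land_sq_mi
print(round(density, 1))    # roughly 103 people per square mile

# Excluding the deserts (14%) and high mountains (27%) mentioned above:
usable_sq_mi = land_sq_mi * (1 - 0.14 - 0.27)
print(round(people / usable_sq_mi, 1))  # roughly 175 per square mile of habitable land
```

So even before ruling out the least hospitable terrain, "fanning out into the countryside" leaves no countryside.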

I love Nature. I love spending time in Nature. (I'd love to publish in Nature, too, but that's a different topic.) The only way it will be possible for any sizeable chunk of people to spend time in Nature is for most of us to live in cities -- frankly, in cities more dense than the ones that exist today in America.

It may be true that there are too many people, but before anyone suggests we start reducing the world population, keep in mind that if we want to get to the point where there is only 1 person per square mile, over 99% of humanity must disappear. Personally, that's not my environmentalist fantasy.

Even Experts Don't Know what Brain Scans Mean

For some reason, many people find neuroscience more compelling than psychology. That is, if you tell them that men seem to like video games more than women, they are unconvinced, but if you say that brain scans of men and women playing video games show that the pleasure centers of their brains respond to video games, suddenly it all seems more compelling.

More flavors is more fun, and the world can accept variation in what types of evidence people find compelling -- we're probably the better for it. In this case, though, there is a problem: neuroscientific data are very hard to interpret. Jerome Kagan said it perfectly in his latest book, so I'll leave it to him:

A more persuasive example is seen in the reactions to pictures that are symbolic of unpleasant (snakes, bloodied bodies), pleasant (children playing, couples kissing), or neutral (tables, chairs) emotional situations. The unpleasant scenes typically induce the largest eyeblink startle response to a loud sound due to recruitment of the amygdala. However, there is greater blood flow to temporal and parietal areas to the pleasant than to the unpleasant pictures, and, making matters more ambiguous, the amplitudes of the event-related waveform eight-tenths of a second after the appearance of the photographs are equivalent to the pleasant and unpleasant scenes. A scientist who wanted to know whether unpleasant or pleasant scenes were more arousing could arrive at three different conclusions depending on the evidence selected.
Daniel Engber in Slate has more excellent discussion of this problem.

Similarly, many posts ago, I noted that another Harvard psychologist, Dan Gilbert, prefers to simply ask people if they are happy rather than use a physiological measure because the only reason we think a particular physiological measure indicates happiness is because it correlates with people's self-reports of being happy. In other words, using any physiological measure (including brain scans) as indication of a mental state is circular.


----
Kagan (2007) What Is Emotion, pp. 81-82.

----
PS Since I've been writing about Russian lately, I wanted to mention an English-language Russian news aggregator that I came across. This site is from the writer behind the well-known Siberian Light Russia blog.