Saturday, December 23, 2006

Culture and climate

Brandeis historian David Hackett Fischer pointed out in his famous Albion's Seed that racial differences had an enormous impact on the history of America. He notes that the cold climate of colonial Massachusetts

"proved to be exceptionally dangerous to immigrants from tropical Africa, who suffered severely from pulmonary infections in New England winters. Black death rates in colonial Massachusetts were twice as high as whites' - a pattern very different from Virginia where mortality rates for the two races were not so far apart, and still more different from South Carolina where white death rates were higher than those of blacks. So high was mortality among African immigrants in New England that race slavery was not viable on a large scale, despite many attempts to introduce it. Slavery was not impossible in this region, but the human and material costs were higher than many wished to pay. A labor system which was fundamentally hostile to the Puritan ethos of New England was kept at bay partly by the climate."

Not surprisingly, in the 19th Century, Massachusetts became the home of abolitionism. South Carolina became the home of secession.

Friday, December 15, 2006

Masculinity and Perceived Status by Females

Physical Strength in Men Correlates with Attractiveness
Am J Hum Biol. 2006 Dec 7;19(1):82-87

Male facial appearance signals physical strength to women. Fink B, Neave N, Seydel H. Previous studies showed that male faces with extreme features that are likely to be associated with testosterone (T) are perceived as dominant and masculine. Women were reported to prefer masculinized male faces, as they may consider T markers to be an "honest" indication of good health. [and a holdover from the days when physical strength equated with status and leadership - TOM]

However, it is also likely that female preferences for certain male faces arise from the fact that dominant- and masculine-looking males are signaling characteristics which may be beneficial in intrasexual conflict, and thereby also indicate potential achievers of high status, an important factor in female mate selection.

Although numerous studies were built on this assumption, nothing is known about the relationship between perceived facial dominance and physical strength in men. We measured hand-grip strength, as a measure of overall physical strength, in a sample of 32 male students, and recorded age, body weight, and height. Seventy-nine women rated facial images of these men for dominance, masculinity, and attractiveness.

After controlling for age and body weight, hand-grip strength was found to correlate significantly and positively with all three measures. The present data thus support the supposition that a male's physical strength is also signaled via facial characteristics of dominance and masculinity, which are considered attractive by women.
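The study's key statistic - a correlation between grip strength and facial ratings "after controlling for age and body weight" - is a partial correlation. A minimal sketch of how such a figure is computed, using synthetic data (the numbers below are illustrative assumptions, not the study's measurements):

```python
import numpy as np

def partial_corr(x, y, controls):
    """Correlation between x and y after regressing out the control
    variables from both (residual method for partial correlation)."""
    Z = np.column_stack([np.ones(len(x)), controls])
    # Residuals of x and y after an ordinary least-squares fit on controls
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical sample of 32 men (matching the study's n, nothing else):
# strength depends partly on age and weight; ratings partly on strength.
rng = np.random.default_rng(0)
n = 32
age = rng.uniform(19, 30, n)
weight = rng.uniform(60, 95, n)
grip = 0.5 * weight + 0.3 * age + rng.normal(0, 4, n)
rating = 0.2 * grip + rng.normal(0, 1, n)

r = partial_corr(grip, rating, np.column_stack([age, weight]))
print(round(r, 2))
```

Regressing both variables on the controls and correlating the residuals removes the part of the association that is merely shared dependence on age and body size.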

Saturday, December 02, 2006

Gene Variant in Abused Boys Linked to Antisocial Behavior

by Joan Arehart-Treichel

When maltreatment is combined with having the short MAOA gene variant, it may put children on track for antisocial behavior later on. Genetic screening, however, may not be a strategy for preventing such behavior.

For four years, the case has been building that a short variant of the monoamine oxidase A (MAOA) gene, when combined with harsh discipline, physical abuse, or other forms of maltreatment, puts youngsters at risk for antisocial behavior.

In 2002, as noted in an earlier post, Avshalom Caspi, Ph.D., of the Institute of Psychiatry in London and his coworkers were the first to report such a connection in a sample of more than 400 young men who had been followed since childhood. In 2004, Debra Foley, Ph.D., an assistant professor of human genetics at Virginia Commonwealth University, and colleagues reported that they had made the same association in youth aged 8 to 17 (Psychiatric News, September 3, 2004). And now Julia Kim-Cohen, Ph.D., an assistant professor of psychology at Yale University, and her team have reported the association once again, but in 7-year-old children. They published their findings in the October Molecular Psychiatry.

This latest inquiry included 975 boys. At ages 5 and 7, the boys had DNA samples taken and were assessed for physical maltreatment, such as fractures, dislocations, or being burned with matches. The scientists then looked to see whether boys who had been physically abused and possessed the short variant of the MAOA gene (16 subjects) had significantly greater mental health problems than did boys who had been physically abused and possessed the long variant of the gene (46 subjects). This turned out to be the case.

The researchers then classified the mental health problems into emotional difficulties, antisocial behavior, or attention and hyperactivity difficulties. Again they found that children who possessed the short MAOA gene version and who had been exposed to physical abuse had more emotional problems, more antisocial behavior, and more attention and hyperactivity difficulties than did those children with the long gene version who had been abused. However, only the attention-hyperactivity results yielded a clear statistical significance.

Source: Psychiatric News

Sunday, November 19, 2006

Speeded up evolution can be predicted

Pressured by predators, lizards see rapid shift in natural selection

November 17, 2006 - CAMBRIDGE, Mass. -

Countering the widespread view of evolution as a process played out over the course of eons, evolutionary biologists have shown that natural selection can turn on a dime - within months - as a population's needs change. In a study of island lizards exposed to a new predator, the scientists found that natural selection dramatically changed direction over a very short time, within a single generation, favoring first longer and then shorter hind legs. The findings, by Jonathan B. Losos of Harvard University and colleagues, are detailed this week in the journal Science. Losos did much of the work before joining Harvard earlier this year from Washington University in St. Louis.

"Because of its epochal scope, evolutionary biology is often caricatured as incompatible with controlled experimentation," says Losos, professor of organismic and evolutionary biology in Harvard's Faculty of Arts and Sciences and curator in herpetology at the Harvard Museum of Comparative Zoology. "Recent work has shown, however, that evolutionary biology can be studied on short time scales and that predictions about it can be tested experimentally. We predicted, and then demonstrated, a reversal in the direction of natural selection acting on limb length in a population of lizards."

Losos and colleagues studied populations of the lizard Anolis sagrei on minuscule islands, or cays, in the Bahamas. They introduced to six of these cays a larger, predatory lizard (Leiocephalus carinatus) commonly found on nearby islands and known as a natural colonizer of small cays. The scientists kept six other control cays predator-free and exhaustively counted, marked, and measured lizards on all 12 isles.

Anolis sagrei spends much of its time on the ground, but previous research has shown that when a terrestrial predator is introduced, these lizards take to trees and shrubs, becoming increasingly arboreal over time. Losos and his colleagues hypothesized that immediately following a predator's arrival, longer-legged - and hence faster-running - Anolis lizards would be favored to elude capture. However, as the lizards grew ever more arboreal in habitat, the scientists projected that natural selection would begin to favor shorter limbs, which are better suited to navigating narrow branches and twigs.

Their hypothesis was borne out. Six months after the introduction of the predator, Losos found that the Anolis population had dropped by half or more on the islands with the predators, and in comparison to the lizards on the predator-free islands, long legs were more strongly favored: Survivors had longer legs relative to non-survivors. After another six months, during which time the Anolis lizards grew increasingly arboreal, selective pressures were exactly the opposite: Survivors were now characterized by having shorter legs on the experimental islands as compared to the control islands.

The behavioral shift from the ground to higher perches apparently caused this remarkable reversal, Losos says, adding that behavioral flexibility may often drive extremely rapid shifts in evolution.

“Evolutionary biology is by its nature an historical science, but the combination of microevolutionary experimentation and macroevolutionary historical analysis can provide a rich understanding about the genesis of biological diversity,” the researchers write.

Friday, November 03, 2006

"My genes made me do it"

Psychiatric News, November 3, 2006, Volume 41, Number 21, page 12. © 2006 American Psychiatric Association

Jury Still Out on Impact Of Genes on Trial Verdicts

Mark Moran

Americans should not be surprised to hear that claim made by criminal defendants as the genetics of behavior, especially antisocial behavior, are explored by science and popularized.

Paul Appelbaum, M.D., chair of APA's Council on Psychiatry and Law, told psychiatrists at APA's 58th Institute on Psychiatric Services last month that the findings of behavioral genetics—even such preliminary findings as have been made to date— are making their way into the American legal system.

He predicted, however, that genetic arguments are not likely to be successful in freeing defendants from guilt for their crimes, but may more likely be advanced in criminal cases as mitigating factors that should be taken into account in sentencing. Yet even there it remains to be seen how a genetic propensity will be viewed by juries and judges; such evidence could just as conceivably be seized upon as an argument against a defendant, Appelbaum said.

"If effective treatment becomes available, the pressure to identify [at-risk individuals] through screening at birth may be irresistible."

Still, the groundwork for the logic of a genetic defense, in the form of the insanity defense, has already been laid by centuries of case law.

"Anglo-American law has created categories to excuse defendants from culpability when their capacity to choose their behavior is significantly impaired," Appelbaum said. "If mental disorders that impair appreciation of wrongfulness or ability to control behavior negate culpability, why shouldn't genetic determinants have the same effect?

"Why should there not be a defense of genetic determinism, a `my genes made me do it' defense? The logic [of moving] from the existing insanity defense to such an argument is not so absurd that it has not already begun to make an appearance in our courts."

Linking MAOA and Violence

Already rippling through the legal system with intriguing implications is a landmark study by Avshalom Caspi, Ph.D., and colleagues that appeared in Science in 2002 demonstrating a remarkable interaction between a specific genetic configuration and early childhood experiences in the development of antisocial disorder.

Drawing on a sample of more than 400 males in Dunedin, New Zealand, who had been followed since childhood for 26 years, Caspi and colleagues were able to examine the levels of monoamine oxidase A (MAOA) activity in those who did and did not exhibit antisocial behavior, including violence, in later years.

MAOA is an enzyme that sits on mitochondrial membranes in neurons and degrades several important neurotransmitters, including several believed to be important in the regulation of aggression and impulsivity. Previous animal research had shown that the absence of MAOA was associated with increased aggression. Levels of MAOA activity differ based on variation in the "promoter region" of the MAOA gene, which controls the transcription of the DNA into messenger RNA.

Caspi and colleagues found from their longitudinal work with the Dunedin sample that low MAOA activity was not itself predictive, but that low MAOA activity in combination with a history of child abuse or neglect was predictive of antisocial behavior, including violence. Individuals with low MAOA activity and severe maltreatment comprised just 12 percent of the sample, but they accounted for 44 percent of the violent crimes committed by the sample. . . .
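The disproportion in the Caspi figures can be made concrete with a little arithmetic: a group that is 12 percent of the sample but accounts for 44 percent of the violent crimes commits crimes at several times the average rate. A quick check (only the two published percentages are used; no cohort sizes are assumed):

```python
# Published figures from the Caspi et al. (2002) study as reported above
share_of_sample = 0.12   # low-MAOA-activity, severely maltreated group
share_of_crimes = 0.44   # fraction of the sample's violent crimes they committed

# Crimes per person in this group relative to the cohort-wide average
overrep = share_of_crimes / share_of_sample

# Crimes per person relative to everyone *outside* the group:
# (44% of crimes from 12% of people) vs. (56% of crimes from 88% of people)
rel_to_rest = (share_of_crimes / share_of_sample) / (
    (1 - share_of_crimes) / (1 - share_of_sample)
)

print(round(overrep, 2))      # ~3.67x the average rate
print(round(rel_to_rest, 2))  # ~5.76x the rate of the rest of the sample
```

So the at-risk group's per-capita rate of violent crime was roughly three and two-thirds times the cohort average, and nearly six times that of the remaining 88 percent.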

Are you out of shape?


COLUMBUS, Ohio – Researchers may get some indication of how aggressively an angry person will react by measuring the size relationship between a person’s ears and other body parts, according to a new study.

Research showed that the farther certain paired body parts were from symmetry – if one ear, index finger or foot was bigger than the other, for example – the more likely it was that a person would show signs of aggression when provoked. The symmetry effects were different in men and women, however.
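The underlying measure here is what biologists call fluctuating asymmetry. The article does not give the authors' exact formula, so the sketch below uses a common composite index - absolute left-right difference scaled by the trait's mean size, summed across traits - with hypothetical measurements:

```python
def fa_index(pairs):
    """Composite fluctuating-asymmetry score: for each (left, right)
    measurement of a paired trait, take |L - R| scaled by the trait's
    mean size, then sum across traits. A standard FA measure, assumed
    here; the study's exact formula is not given in the article."""
    return sum(abs(l - r) / ((l + r) / 2.0) for l, r in pairs)

# Hypothetical measurements in mm: (left, right) for ear length,
# index-finger length, and foot length.
symmetric = [(60.0, 60.0), (72.0, 72.0), (250.0, 250.0)]
lopsided = [(60.0, 63.0), (72.0, 70.0), (250.0, 244.0)]

print(fa_index(symmetric))           # 0.0
print(round(fa_index(lopsided), 3))  # 0.101
```

Scaling each difference by the trait's size keeps a 3 mm ear difference from being swamped by a 6 mm foot difference, so traits of very different sizes contribute comparably.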

While the findings may seem strange, there is a plausible explanation, said Zeynep Benderlioglu, co-author of the study and a post-doctoral researcher at Ohio State University.

Deviations from symmetry are thought to reflect stressors during pregnancy – such as poor health, alcohol and tobacco use – that may affect development of the fetus in a variety of ways.

"Paired body parts are presumably controlled by similar genetic instructions, so if everything goes perfectly you would expect paired body parts to be the same size," Benderlioglu said.

"But stressors during pregnancy may lead to asymmetrical body parts. The same stressors will also affect development of the central nervous system, which involves impulse control and aggression," she said. "So while asymmetry doesn’t cause aggression, they both seem to be correlated to similar factors during pregnancy."

Benderlioglu conducted the study with Randy Nelson, a professor of psychology and neuroscience, and Paul Sciulli, professor of anthropology, both at Ohio State. Their results were published in a recent issue of the American Journal of Human Biology....

Benderlioglu said the same conditions in pregnancy that lead to asymmetry in body parts probably affect development of parts of the central nervous system that deal with impulse control. The result is that people with higher levels of asymmetry also have a harder time controlling their aggressive impulses.

Other studies have indicated that testosterone is related to a tendency toward anger. So people who show both high levels of asymmetry and high levels of testosterone may react particularly aggressively when provoked, she said....

The results emphasize once again the importance of healthy habits during pregnancy, Benderlioglu said. Smoking and heavy alcohol use are among the stressors that may lead to both asymmetry and poor impulse control....

Thursday, November 02, 2006

Guns, Genes, and Steel

Whistling Past the Graveyard
by John Derbyshire (Nov. 2006)
Reviewing Mark Steyn's new book, America Alone
. . . .

Ah, culture. Of course it’s not about race! Nothing is about race, because there is no such thing as race. (Repeat 100 times.) It’s about culture—the aether, the phlogiston, of current social-anthropological speculation, whose actual nature is mysterious, but whose explanatory power is infinite. You know, culture: those habits, folkways, beliefs, ways of thinking and behaving and connecting that arise from... pure chance! Or geography (see below). Or something... but definitely nothing to do with biology.

Please don’t get me wrong. I am sure Mark Steyn is sincere here. I am sure he believes this stuff about “culture.” Most educated people do. Most will continue to do so for a few more years, while the neuroscientists, geneticists, genomicists, anthropologists, paleontologists, and statistical sociologists sap away beneath them—until the ground gives way. (A professional academic biologist friend of mine is in the habit of snapping out, any time anyone takes refuge in this “culture” stuff: “Culture? Culture? What does that mean? Where does it come from? What are the upstream variables?”)

One of the great anthropological-historical best-sellers of recent years was Jared Diamond’s Pulitzer Prize-winning Guns, Germs, and Steel. Different human populations, in different parts of the world, says Diamond, developed different cultures, depending on whether they had draft animals to hand, easy routes for disease transmission, and so on. Diamond almost completely ignores the role of inheritance and natural selection in shaping human populations. Natural selection? That all came to a screeching halt 50,000 years ago, don’t you know, when Homo sapiens showed up. There have been no biological changes since then, none at all! Certainly none that affect behavior or socialization. We are all exactly the same structurally, we just behave differently according to the local geography. Location, location, location.

Alas, our understanding of population genetics has already left Jared Diamond behind. Good, solid scientific studies are beginning to appear that altogether refute the “culture” paradigm. We are not a uniform species, inclined to different folkways by the pressures of geography. A population of human beings, breeding mostly within itself, is shaped by the menu of genetic peculiarities it started out with, and by its breeding practices (did you know that 55 percent of Saudi marriages are between first or second cousins?), and by its environment. We are not a uniform species at all. Not many world-wide species are.

Tuesday, October 31, 2006

Moral Minds

New York Times
October 31, 2006
Books on Science
An Evolutionary Theory of Right and Wrong

Who doesn’t know the difference between right and wrong? Yet that essential knowledge, generally assumed to come from parental teaching or religious or legal instruction, could turn out to have a quite different origin.

Primatologists like Frans de Waal have long argued that the roots of human morality are evident in social animals like apes and monkeys. The animals’ feelings of empathy and expectations of reciprocity are essential behaviors for mammalian group living and can be regarded as a counterpart of human morality.

Marc D. Hauser, a Harvard biologist, has built on this idea to propose that people are born with a moral grammar wired into their neural circuits by evolution. In a new book, “Moral Minds” (HarperCollins 2006), he argues that the grammar generates instant moral judgments which, in part because of the quick decisions that must be made in life-or-death situations, are inaccessible to the conscious mind....

The proposal, if true, would have far-reaching consequences. It implies that parents and teachers are not teaching children the rules of correct behavior from scratch but are, at best, giving shape to an innate behavior. And it suggests that religions are not the source of moral codes but, rather, social enforcers of instinctive moral behavior.

Both atheists and people belonging to a wide range of faiths make the same moral judgments, Dr. Hauser writes, implying “that the system that unconsciously generates moral judgments is immune to religious doctrine.” Dr. Hauser argues that the moral grammar operates in much the same way as the universal grammar proposed by the linguist Noam Chomsky as the innate neural machinery for language. The universal grammar is a system of rules for generating syntax and vocabulary but does not specify any particular language. That is supplied by the culture in which a child grows up.

The moral grammar too, in Dr. Hauser’s view, is a system for generating moral behavior and not a list of specific rules. It constrains human behavior so tightly that many rules are in fact the same or very similar in every society — do as you would be done by; care for children and the weak; don’t kill; avoid adultery and incest; don’t cheat, steal or lie.

But it also allows for variations, since cultures can assign different weights to the elements of the grammar’s calculations. Thus one society may ban abortion, another may see infanticide as a moral duty in certain circumstances. Or as Kipling observed, “The wildest dreams of Kew are the facts of Katmandu, and the crimes of Clapham chaste in Martaban.”

Matters of right and wrong have long been the province of moral philosophers and ethicists. Dr. Hauser’s proposal is an attempt to claim the subject for science, in particular for evolutionary biology. The moral grammar evolved, he believes, because restraints on behavior are required for social living and have been favored by natural selection because of their survival value....

The moral grammar now universal among people presumably evolved to its final shape during the hunter-gatherer phase of the human past, before the dispersal from the ancestral homeland in northeast Africa some 50,000 years ago. This may be why events before our eyes carry far greater moral weight than happenings far away, Dr. Hauser believes, since in those days one never had to care about people remote from one’s environment....

Dr. Hauser believes that the moral grammar may have evolved through the evolutionary mechanism known as group selection. A group bound by altruism toward its members and rigorous discouragement of cheaters would be more likely to prevail over a less cohesive society, so genes for moral grammar would become more common.

Many evolutionary biologists frown on the idea of group selection, noting that genes cannot become more frequent unless they benefit the individual who carries them, and a person who contributes altruistically to people not related to him will reduce his own fitness and leave fewer offspring.

But though group selection has not been proved to occur in animals, Dr. Hauser believes that it may have operated in people because of their greater social conformity and willingness to punish or ostracize those who disobey moral codes.

“That permits strong group cohesion you don’t see in other animals, which may make for group selection,” he said....

Wednesday, October 25, 2006


October 24, 2006

Op-Ed Columnist
New York Times

One Nation, Divisible

An American in Iraq has finally gotten it almost right.

J. D. Thurman, the major general who is the senior commander of U.S. forces in Baghdad, has figured out the obstacle to America’s dream for Iraq.

“Part of our problem is that we want this more than they do,” General Thurman told The Times’s Michael Gordon, alluding to American efforts to unify Iraqis. “We need to get people to stop worrying about self and start worrying about Iraq.”

That’s a refreshingly candid alternative to the usual lines we hear about the Iraqi people’s patriotism and resolve. General Thurman predicted that Americans will keep struggling unless Iraqis put aside their differences. Quite right — and quite depressing, because they’re not about to do it, no matter what timetable the U.S. tries to impose.

But what’s stopping them is not selfishness. When General Thurman talked about the conflict between serving oneself and serving one’s country, he was applying an American template to a different culture. Rampant individualism is not the problem in Iraq.

The problem is that they have so many social obligations more important to them than national unity. Iraqis bravely went to the polls and waved their purple fingers, but they voted along sectarian lines. Appeals to their religion trumped appeals to the national interest. And as the beleaguered police in Amara saw last week, religion gets trumped by the most important obligation of all: the clan.

The deadly battle in Amara wasn’t between Sunnis and Shiites, but between two Shiite clans that have feuded for generations. After one clan’s militia destroyed police stations and took over half the city, the Iraqi Army did not ride to the rescue. Authorities regained control only after the clan leaders negotiated a truce.

When the U.S. invaded Iraq, American optimists invoked Germany and Japan as models for their democratization project, but Iraq didn’t have the cultural cohesion or national identity of those countries. The shrewdest forecasts I heard came not from foreign policy experts but from anthropologists and sociologists who noted a crucial statistic: nearly half of Iraqis were married to their first or second cousins.

Unlike General Thurman and other Westerners, members of these tightly knit Iraqi clans don’t look on society as a collection of individuals working for the common good of the nation. “In a modern state a citizen’s allegiance is to the state, but theirs is to their clan and their tribe,” Ihsan M. al-Hassan, a sociologist at the University of Baghdad, warned three years ago. “If one person in your clan does something wrong, you favor him anyway, and you expect others to treat their relatives the same way.”

These allegiances explain why Iraqis don’t want to give up their local militias. They know it’s unrealistic to expect protection from a national force of soldiers or police officers from other clans, other regions, other religions. When the Iraqi Army ordered reinforcements to go help Americans keep peace in Baghdad, several Iraqi battalions deserted rather than risk their lives defending strangers.

Instead of trying to transform Iraqis into patriots and build up national security forces, the U.S. should be urging decentralization. The national government should concentrate on defending the borders and equitably distributing oil revenue, ideally by distributing shares of the oil wealth directly to citizens.

Most other duties, including maintaining law and order, should devolve to autonomous local governments: one for the Kurdish north, one for the Sunni Triangle, one for the Shiite south, plus coalition governments in Baghdad and the multiethnic region around Kirkuk. The result would hardly be peace. There would still be murderous religious conflicts in Baghdad and fierce interclan battles in towns like Amara.

But the local leaders — elected officials, police officers, sheiks, clerics — would be in a better position to provide security and negotiate truces than would a national government. It’s no accident that the most stable part of Iraq is also the most autonomous: Kurdistan, where two rival clans have negotiated a relatively peaceful coexistence.

It wouldn’t be easy for Iraqis in other regions to work out their differences, but the local leaders would have one crucial advantage over any Iraqis or Americans giving orders from Baghdad. They would realize their neighbors are not going to suddenly embrace national unity. They would know you make peace with the citizenry you have, not the one you want.

Friday, October 20, 2006

Positive feedback relationship

It is admittedly difficult to prove that mankind has changed biologically since, let us say, the days of the ancient Greeks and Romans, if by "proof" you mean demonstration of sizeable gene differences. We cannot test the genes of Pericles or Caesar or their contemporaries. But neither was Darwin able to "prove" organic evolution in this sense. The evidence is indirect, inferential, but nevertheless, I think, conclusive.

Paradoxically, it is precisely because we know that mankind changes so greatly culturally that we can be so confident that it changes to some extent also genetically. When the environment changes, the only other necessary condition for the occurrence of genetic evolutionary change can be defined. This is the presence in human populations of genetic variants, some of which confer upon their carriers a higher fitness....

Despite all the inadequacies of our present knowledge of human genetics, this can scarcely be doubted. What is more, since the environment in which man lives is in the first place his sociocultural environment, the genetic changes induced by culture must affect man's fitness for culture and hence may affect culture. The process thus becomes self-sustaining.

Biological changes increase the fitness for, and the dependence of their carriers on culture, and therefore stimulate cultural developments; cultural developments in turn instigate further genetic changes. This amounts to a positive feedback relationship between the cultural and the biological evolutions. The positive feedback explains the great evolutionary change, so great that it creates the illusion of an unbridgeable gap between our animal ancestors and ourselves....

Dobzhansky T. 1963. Anthropology and the natural sciences - The problem of human evolution. Curr Anthropol 4:138+146-148

Thursday, October 19, 2006

The likelihood of more recent genetic evolution

... the capacity for innovation in behaviorally modern humans materially speeded up evolution, because it led to frequent innovation, and every significant innovation created a mismatch with the environment and, therefore, new selective pressures. Look at the Bushmen: they're 4' 8" and hunt big game. They couldn't do it without poisoned arrows and, back before missile weapons, no one did: early humans were bigger and built like linebackers. The bow begat the Bushmen.

Take agriculture: the switch to reliance upon cereals cut protein intake almost threefold while reducing protein quality and greatly increasing the percentage of high-glycemic carbohydrates in the diet (along with other changes). That put huge areas of metabolism under selective pressure - towards more robust glucose regulation, towards changes that conserve protein, especially essential/scarce amino acids. Check out the distribution of diabetes - it's not 'thrifty genes', it's pre- and post-agricultural adaptations.

... Female-farming systems would seem likely to select for reduced paternal investment and increased intermale competition and display: the selected myostatin mutations, along with the regional differences in the androgen receptor may fit into this picture.

...A more complex and hierarchical society (with far greater reproductive skew) must have selected for different cognitive and personality profiles - not just among the Ashkenazi or the Chinese, but in all civilized populations. Look, all those personality and cognitive traits have substantial heritability, so selection happened.... For that matter, different kinds of hunter-gatherer ecology selected for different traits; Eskimos are not Bushmen are not Negritos.

...without modern medical care, Amerindian and other long-isolated peoples are incredibly vulnerable to Old World infectious diseases, enormously more so than Eurasians and Africans. The domestication of cows turned Northern Europeans into mampires that live off the milk of another species.

Sewing, the atlatl, pottery, writing - all changed peoples.

The recent adaptations to agriculture are of course not even found in hunter-gatherers, but agriculturalists in different parts of the world have mostly experienced different genetic changes... For example, the genetic basis of skeletal gracilization in Europe/Middle East appears to be fundamentally different than that in China, lactose tolerance in the Masai is caused by a different allele than in Europe, while the genetic basis of light skin color is entirely different in China and Europe.

Which implies that the causal mechanism, the nuts and bolts of >100 IQ, likely a consequence of post-agricultural adaptation, probably differs significantly between East and West, just as the details of skin color do. The psychometric substructure sure looks different.

Greg Cochran, commenting on the Gene Expression blog, disagreeing with the prevalent assumption that evolution stopped well over 10,000 years ago.

Wednesday, October 11, 2006

Clothes unmake the woman

"Even after her husband was executed, Marie Antoinette defied her captors by ordering mourning dress, seeking solace in the illusion that had set her on her unlucky course: the notion that by controlling her image, she could master her fate. Bound for the chopping block, deprived of her widow’s weeds, she still contrived to have a clean-lined martyr’s costume smuggled into her cell. She was the first woman of whom it truthfully could be said that she shopped until she dropped."

Liesl Schillinger reviewing

What Marie Antoinette Wore to the Revolution.
By Caroline Weber

in The New York Times Book Review

Tuesday, October 10, 2006

Bred for Aggression

Nicholas Wade once again identifies some meaningful research and makes it available to the general reader. If such work in neuroscience and genetics shows how brains, including human brains, predispose us to certain kinds of productive or counterproductive actions, why don't our modes of intervention (including, of course, punishment) reflect such findings? We are caught in a vicious cycle in which adverse environments reinforce genetically influenced behaviour. This is cause for much pessimism, particularly since a whole welfare industry with a vested interest in maintaining the status quo tends to frustrate any real modification, if modifications can indeed be made in laissez-faire societies....


October 10, 2006 ~ The New York Times

Flyweights, Yes, but Fighters Nonetheless: Fruit Flies Bred for Aggressiveness

By Nicholas Wade

What can stand on its hind legs and duke it out with its front feet, boxing and tussling like a four-armed pugilist?

The answer: a strain of laboratory fruit flies bred for shameless aggressiveness toward their own kind.

These miniature gladiators flail at each other with a zeal and tempo that make professional boxers look like milquetoasts. A video shows a Drosophilan version of Mike Tyson forcing an opponent to fly the ring.

The fighting flies have been bred by Herman A. Dierick and Ralph J. Greenspan, two biologists at the Neurosciences Institute in San Diego. Their goal is to discover the neural circuits that are genetically modified when flies develop aggressive behavior.

Fruit flies in the wild are quite hostile toward one another. Males will fight off other males from prize real estate, like a rotten peach, where females like to congregate. But when kept in the laboratory, subsequent generations soon become domesticated.

Dr. Dierick and Dr. Greenspan figured that since this behavior was easily lost, it should be easy enough to regain if the right selective pressure were applied. So they took a laboratory strain of tame fruit flies and set up pots of food that could be protected by single males.

The males that fought the hardest in these encounters were sucked off their little arenas with a pipette and rewarded by becoming the fathers of the next generation.

More aggressive males started to appear after only 5 generations, and by the 21st generation, Dr. Dierick found that the aggressiveness of male fruit flies had increased more than 30-fold, according to a scoring system he developed.
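The breeding scheme Wade describes, picking the hardest fighters each generation and making them the sole fathers, can be illustrated with a toy truncation-selection model: if the trait is heritable, the population mean drifts upward generation after generation. The sketch below is purely illustrative; the function name, parameter values, and heritability figure are my assumptions, not the researchers' actual protocol or scoring system.

```python
import random

def breed_for_aggression(pop_size=200, generations=21, heritability=0.5,
                         n_sires=20, seed=0):
    """Toy truncation-selection model (illustrative; not the authors' protocol).

    Each male carries an aggression score; each generation the top `n_sires`
    males father the next generation.  Offspring inherit the sires' mean
    score scaled by heritability, plus fresh environmental noise.
    """
    rng = random.Random(seed)
    pop = [rng.gauss(0.0, 1.0) for _ in range(pop_size)]
    means = [sum(pop) / pop_size]
    for _ in range(generations):
        sires = sorted(pop, reverse=True)[:n_sires]    # hardest fighters win
        sire_mean = sum(sires) / n_sires
        pop = [heritability * sire_mean +
               (1.0 - heritability) * rng.gauss(0.0, 1.0)
               for _ in range(pop_size)]
        means.append(sum(pop) / pop_size)
    return means

means = breed_for_aggression()
print(f"generation 0 mean: {means[0]:+.3f}, generation 21 mean: {means[-1]:+.3f}")
```

Even this crude model shows the qualitative effect: the response to selection accumulates for a few generations and then plateaus as the selected parents become more uniform.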

At this point he was able to perform an experiment that would have been quite messy had he been working with larger animals. He chopped off the heads of 100 aggressive individuals, ground them up and ran a test to measure changes in the activity of their brain genes.

About 80 genes — the fruit fly has about 14,000 — were either more or less active in the brains of the aggressive flies, compared with flies of the original population from which they were selected.

Two of the most changed genes were ones involved in the detection of pheromones, the hormonelike scents with which fruit flies signal their activities.

One of these genes seems to make the aggressive flies unusually sensitive to the pheromones emitted by other males. Another, which is repressed in the aggressive flies, mediates sensitivity to the pheromone with which male flies mark their territory. The aggressive flies seem less able to recognize others' boundaries.

Dr. Dierick hopes to identify the neural circuit in the fly's brain that mediates aggressive behavior and that is modulated up or down by inputs from pheromones and other sources. Dr. Greenspan said an understanding of how genes set up circuits to govern behavior would be of broad significance in understanding what makes either flies or people tick.

Saturday, September 09, 2006

On national character traits

"As Gov. Schwarzenegger implied, the more outgoing West African personality contribution in Puerto Ricans makes them more extroverted on average that Mexicans, whose Native American contribution inclines them toward solidity and introversion. (Spaniards tend to be fairly extroverted, so mestizos come in a wide range of introversion to extroversion, while mulattos are more consistently extroverted. That may have something to do with why the smaller Caribbean population has made a bigger impact on American popular culture then the very large Mexican-American population -- e.g.., Jennifer Lopez, and American-born Puerto Rica won the role of Selena, the American-born Tex-Mex singer."

- From Steve Sailer's blog, September 8, 2006


The New Atlantis, Summer 2003

Eugenics—Sacred and Profane
Christine Rosen


"A recent working paper by the President’s Council on Bioethics noted that “as genomic knowledge increases and more genes are identified that correlate with diseases, the applications for preimplantation genetic diagnosis will likely increase greatly,” including for medical conditions such as cancer, mental illness, or asthma, and non-medical traits such as temperament or height. “While currently a small practice,” the Council working paper declares, “PGD is a momentous development. It represents the first fusion of genomics and assisted reproduction—effectively opening the door to the genetic shaping of offspring.”

In one sense, of course, PGD poses no new eugenic dangers. Genetic screening using amniocentesis has allowed parents to test the fitness of potential offspring for years. But PGD is poised to increase this power significantly: It will allow parents to choose the child they want, not simply reject the one they do not want. It will change the overriding purpose of IVF, which began as a treatment for infertility but now aims, as one prominent fertility expert has said, “to help prospective parents realize their own dreams of having a disease-free legacy.” Over time, PGD will be used not simply to spare the birth of “doomed children,” who would be born with diseases that kill in the first few years of life, but to avoid the birth of children with a higher chance of getting certain illnesses later in life. And it will do so, as one ethicist described, “only by picking and choosing embryos like consumer goods—producing many, discarding most, and desiring only the chosen few.”

Sunday, August 13, 2006

The culture of getting away with it

David Brooks, in his column below, fails to mention a key variable affecting national character traits: group intelligence, which is to say, the aggregation of individual IQs by country or region. Corruption may correlate inversely with this collective capacity, which in turn shapes cultural practices. Hong Kong and Singapore, and Chile for that matter, eliminated corruption in a few decades because their residents had the collective wherewithal to fashion a civil society with the necessary rules, which were eventually woven into the ethos of those countries.


August 13, 2006
Op-Ed Columnist

The Culture of Nations

Diplomats in New York rack up a lot of unpaid parking tickets, but not all rack them up at the same rates. According to the economists Raymond Fisman and Edward Miguel, diplomats from countries that rank high on the Transparency International corruption index pile up huge numbers of unpaid tickets, whereas diplomats from countries that rank low on the index barely get any at all.

Between 1997 and 2002, the U.N. Mission of Kuwait picked up 246 parking violations per diplomat. Diplomats from Egypt, Chad, Sudan, Mozambique, Pakistan, Ethiopia and Syria also committed huge numbers of violations. Meanwhile, not a single parking violation by a Swedish diplomat was recorded. Nor were there any by diplomats from Denmark, Japan, Israel, Norway or Canada.

The reason there are such wide variations in ticket rates is that human beings are not merely products of economics. The diplomats paid no cost for parking illegally, thanks to diplomatic immunity. But human beings are also shaped by cultural and moral norms. If you’re Swedish and you have a chance to pull up in front of a fire hydrant, you still don’t do it. You’re Swedish. That’s who you are....

For several decades a veteran foreign aid worker, Lawrence E. Harrison, has contemplated the power of culture in shaping behavior. He’s concluded that cultural differences mostly explain why some nations develop quickly while others do not.

All cultures have value because they provide coherence, but some cultures foster development while others retard it. Some cultures check corruption, while others permit it. Some cultures focus on the future, while others focus on the past. Some cultures encourage the belief that individuals can control their own destinies, while others encourage fatalism.

In a new book, “The Central Liberal Truth,” Harrison takes up the question that is at the center of politics today: Can we self-consciously change cultures so they encourage development and modernization? Harrison is writing about poverty, but this is incidentally a book about the war on terror, and whether it is possible to change culture in the Middle East and the ghettos of Muslim Europe.

On the one hand, Harrison is an optimist. He has taken his title from one of Daniel Patrick Moynihan’s greatest observations: “The central conservative truth is that it is culture, not politics, that determines the success of a society. The central liberal truth is that politics can change a culture and save it from itself.”

But when Harrison turns to how politics can change culture, you find he is a man who has been made aware of the limitations on what we can know and achieve. Harrison and a team of global academics studied cultural transformations in Ireland, China, Latin America and elsewhere. They concluded that cultural change can’t be imposed from the outside, except in rare circumstances. It has to be led by people who recognize and accept responsibility for their own culture’s problems and selectively reinterpret their own traditions to encourage modernization.

Harrison observes that gigantic investments in education, and especially in improving female literacy, usually precede transformations. Chile was highly literate in the 19th century, and in 1905, 90 percent of Japanese children were in school. These investments laid the groundwork for takeoffs that were decades away.

Harrison points to many other factors — leaders who encourage economic liberalization, movements that restrict the power of the clerics — but the main impressions he leaves are that cultural change is measured in centuries, not decades, and that cultures are separated from one another by veils of complexity and difference.

If Harrison is right, it is no wonder that young Muslim men in Britain might decide to renounce freedom and prosperity for midair martyrdom. They are driven by a deep cultural need for meaning. But it is also foolish to think we can address the root causes of their toxic desires. We’ll just have to fight the symptoms of a disease we can neither cure nor understand.

Friday, August 11, 2006

Once were warriors: gene linked to Maori violence

MAORIS carry a "warrior" gene that makes them more prone to violence, criminal acts and risky behaviour, a scientist has controversially claimed.

Dr Rod Lea, a New Zealand researcher, and his colleagues told an Australian genetics conference that Maori men had a "striking over-representation" of a variant of the monoamine oxidase gene - dubbed the warrior gene - which they say is strongly associated with aggressive behaviour.

He says the unpublished studies prove that Maoris have the highest prevalence of this gene, first discovered by US researchers but never before linked to an ethnic group.

This explains how Maoris migrated across the Pacific and survived, said Dr Lea, a genetic epidemiologist at the New Zealand Institute of Environmental Science and Research.

But he said the presence of the gene also "goes a long way to explaining some of the problems Maoris have".

"Obviously, this means they are going to be more aggressive and violent and more likely to get involved in risk-taking behaviour like gambling," Dr Lea said before his presentation to the International Congress of Human Genetics in Brisbane.

Dr Lea said he believed other, non-genetic factors might also be at play. "There are lots of lifestyle, upbringing-related exposures that could be relevant here, so obviously the gene won't automatically make you a criminal."

The same gene was linked to high rates of alcoholism and smoking. "In terms of alcohol-metabolising genes we've found that Maori have a very unique genetic signature," Dr Lea said.

"That influences their drinking behaviour, so they're much more likely to binge drink than other groups …"

The researchers are now collecting thousands of DNA samples from Maoris to investigate these traits.

They can then work out precisely what role each gene plays and use this to explore these trends in the mainstream populations.

"With Maori it's easier to find the genes than it is in the broader Caucasian population so it's a great case study," Dr Lea said.

Wednesday, July 05, 2006

East and west speakers make different calculations

Source: scenta

Native English speakers calculate mathematical problems quite differently from those who learned Chinese as their first language.

While completing simple arithmetic, the two groups were observed to use different parts of the brain, according to a report in Proceedings of the National Academy of Sciences.

Researchers utilised brain imaging techniques to determine which parts of the brain were active when participants completed basic sums, such as four plus five equals nine.

All arithmetic questions were with Arabic numbers, a numeral system familiar in both cultures.

Both the Chinese and English groups utilised a region of the brain known as the inferior parietal cortex, an area connected to quantity representation and reading.

English speakers, however, displayed more activity in the language processing area of the brain, while their Chinese counterparts used the area of the brain that deals with processing visual information.

Lead author Yiyuan Tang of Dalian University of Technology in Dalian, China told the Associated Press that the difference "may mean that Chinese speakers perform problems in a different manner than English speakers".

"In part that might represent the difference in language - it could be that the difference in language encourages different styles of computation and this may be enhanced by different methods of learning to deal with numbers."

These cultural differences in how mathematical problems are solved may help scientists develop better approaches to making calculations.

Richard Nisbett, co-director of the Culture and Cognition Program at the University of Michigan, said to the Associated Press that "the work is important because it tells us something about the particular pathways in the brain that underlie some of the differences between Asians and Westerners in thought patterns".

Nisbett hopes that the results of the study will lead to better understandings of the different cultures’ mindsets which may provide a basis upon which to learn from each other.

"They literally are seeing the world differently," he added.

Wednesday, June 21, 2006

Slaking the thirst for knowledge... an addiction...?

'Thirst for knowledge' may be opium craving
General Science : June 20, 2006

Neuroscientists have proposed a simple explanation for the pleasure of grasping a new concept: The brain is getting its fix. The "click" of comprehension triggers a biochemical cascade that rewards the brain with a shot of natural opium-like substances, said Irving Biederman of the University of Southern California. He presents his theory in an invited article in the latest issue of American Scientist.

"While you're trying to understand a difficult theorem, it's not fun," said Biederman, professor of neuroscience in the USC College of Letters, Arts and Sciences.

"But once you get it, you just feel fabulous."

The brain's craving for a fix motivates humans to maximize the rate at which they absorb knowledge, he said.

"I think we're exquisitely tuned to this as if we're junkies, second by second."

Biederman hypothesized that knowledge addiction has strong evolutionary value because mate selection correlates closely with perceived intelligence.

Only more pressing material needs, such as hunger, can suspend the quest for knowledge, he added.

The same mechanism is involved in the aesthetic experience, Biederman said, providing a neurological explanation for the pleasure we derive from art.

"This account may provide a plausible and very simple mechanism for aesthetic and perceptual and cognitive curiosity." [curiosity does not, however, correlate with intelligence-TOM ]

Biederman's theory was inspired by a widely ignored 25-year-old finding that mu-opioid receptors – binding sites for natural opiates – increase in density along the ventral visual pathway, a part of the brain involved in image recognition and processing.

The receptors are tightly packed in the areas of the pathway linked to comprehension and interpretation of images, but sparse in areas where visual stimuli first hit the cortex.

Biederman's theory holds that the greater the neural activity in the areas rich in opioid receptors, the greater the pleasure.

In a series of functional magnetic resonance imaging trials with human volunteers exposed to a wide variety of images, Biederman's research group found that strongly preferred images prompted the greatest fMRI activity in more complex areas of the ventral visual pathway. (The data from the studies are being submitted for publication.)

Biederman also found that repeated viewing of an attractive image lessened both the rating of pleasure and the activity in the opioid-rich areas. In his article, he explains this familiar experience with a neural-network model termed "competitive learning."

In competitive learning (also known as "Neural Darwinism"), the first presentation of an image activates many neurons, some strongly and a greater number only weakly.

With repetition of the image, the connections to the strongly activated neurons grow in strength. But the strongly activated neurons inhibit their weakly activated neighbors, causing a net reduction in activity. This reduction in activity, Biederman's research shows, parallels the decline in the pleasure felt during repeated viewing.
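The competitive-learning dynamic described above (a few winning units strengthen with repetition while inhibiting their weakly activated neighbors, so net activity falls) can be sketched in a few lines. This is a minimal illustration under my own assumed parameters, not Biederman's actual model; the function name and all numeric values are invented for the sketch.

```python
import random

def repeated_presentations(n_units=50, n_presentations=10, n_winners=5,
                           learning_rate=0.3, inhibition=0.05, seed=1):
    """Minimal competitive-learning sketch (illustrative, not Biederman's model).

    A stimulus initially drives many units, a few strongly and most weakly.
    Each presentation strengthens the strongest units, which inhibit their
    weakly driven neighbours, so total network activity declines even as
    the winners sharpen.  Returns total activity per presentation.
    """
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_units)]   # unit responsiveness
    totals = []
    for _ in range(n_presentations):
        ranked = sorted(range(n_units), key=lambda i: weights[i], reverse=True)
        winners = set(ranked[:n_winners])
        suppression = inhibition * sum(weights[i] for i in winners)
        # net activity: winners respond fully, losers are inhibited (floor 0)
        totals.append(sum(weights[i] if i in winners
                          else max(weights[i] - suppression, 0.0)
                          for i in range(n_units)))
        for i in range(n_units):
            if i in winners:                           # Hebbian strengthening
                weights[i] += learning_rate * (1.0 - weights[i])
            else:                                      # losers are suppressed
                weights[i] = max(weights[i] - suppression, 0.0)
    return totals

totals = repeated_presentations()
print(f"first presentation: {totals[0]:.2f}, last: {totals[-1]:.2f}")
```

The falling totals mirror the finding reported above: repeated viewing sharpens the winning representation while overall activity, and on Biederman's account the accompanying pleasure, declines.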

"One advantage of competitive learning is that the inhibited neurons are now free to code for other stimulus patterns," Biederman writes.

This preference for novel concepts also has evolutionary value, he added.

"The system is essentially designed to maximize the rate at which you acquire new but interpretable [understandable] information. Once you have acquired the information, you best spend your time learning something else.

"There's this incredible selectivity that we show in real time. Without thinking about it, we pick out experiences that are richly interpretable but novel."

The theory, while currently tested only in the visual system, likely applies to other senses, Biederman said.

Source: University of Southern California

Saturday, June 17, 2006

"The better part of valor is discretion"

Wall Street Journal. (Eastern edition).
June 16, 2006. pg. A.1

Scientist's Study Of Brain Genes Sparks a Backlash

Antonio Regalado

[Dr. Bruce Lahn of the University of Chicago] says he is moving away from the research. "It's getting too controversial," he says....

Dr. Lahn had touched a raw nerve in science: race and intelligence.

Dr. Lahn has drawn sharp fire from other leading genetics researchers. They say the genetic differences he found may not signify any recent evolution -- and even if they do, it is too big a leap to suggest any link to intelligence. "This is not the place you want to report a weak association that might or might not stand up," says Francis Collins, director of the genome program at the National Institutes of Health....

Pilar Ossorio, a professor of law and medical ethics at the University of Wisconsin, criticizes Dr. Lahn for implying a conclusion similar to "The Bell Curve," a controversial 1994 bestseller by Richard J. Herrnstein and Charles Murray. The book argued that the lower average performance by African-Americans on IQ tests had a genetic component and wasn't solely the result of social factors. Referring to Dr. Lahn and his co-authors, Prof. Ossorio says: "It's exactly what they were getting at. There was a lot of hallway talk. People said he's doing damage to the whole field of genetics."...

[Lahn] personally believes it is possible that some populations will have more advantageous intelligence genes than others. And he thinks that "society will have to grapple with some very difficult facts" as scientific data accumulate. Yet Dr. Lahn, who left China after participating in prodemocracy protests, says intellectual "police" in the U.S. make such questions difficult to pursue....

Henry Harpending, a University of Utah anthropology professor who recently published a theory for why Ashkenazi Jews tend to have high IQs, says Dr. Lahn once suggested they co-author an article for Scientific American about the genetics of behavior, in which they could explain why "Chinese are boring."

"I think that Bruce doesn't understand political correctness," Dr.Harpending says. Dr. Lahn says he only vaguely recalls the conversation but confirms that he wonders whether during China's imperial times there was "some selection" against rebellious individuals....

[Lahn's work] suggested brain evolution might have occurred in tandem with important cultural changes. Yet because neither variant is common in sub-Saharan Africa, there was another potential implication: Some groups had been left out....

"You have to follow the data wherever it leads, but speculating in this field is dangerous," says Spencer Wells, head of the National Geographic Society's Genographic Project, a five-year, $40 million effort to collect DNA samples from 100,000 indigenous people. Dr. Wells says the project team might try to find evolutionary reasons for physical differences such as why Danes are taller than pygmies. But Dr. Wells says National Geographic won't study the brain. "I think there is very little evidence of IQ differences between races," he says....

Dr. Lahn stands by his work but says that because of the controversy he is moving into other projects. Earlier this year, Mr. Easton of the university's media department forwarded Dr. Lahn a paper by two economists looking at the IQ of infants of different races. Dr. Lahn wasn't interested. "I'm surprised anyone studies this," he replied in an email.

Dr. Lahn says he isn't as eager as he once was to continue studying brain differences. P. Thomas Schoenemann, a professor of anthropology at the University of Michigan-Dearborn, says that at Dr. Lahn's request he collected DNA from 25 people whose brain sizes he had studied previously. But the two scientists haven't been in touch recently.

The university's patent office is also having second thoughts. Its director, Alan Thomas, says his office is dropping a patent application filed last year that would cover using Dr. Lahn's work as a DNA-based intelligence test. "We really don't want to end up on the front page ... for doing eugenics," Mr. Thomas says.

Monday, June 05, 2006

The role of dominance (power tripping) in the Islamic Threat

The week following the Muslim protests in London against the Danish cartoons—with marchers carrying signs calling for the beheading of infidels—other Muslims demonstrated to claim that Islam really meant peace and tolerance. While their implicit recognition that peace and tolerance are preferable to strife and bigotry was welcome, the claim regarding Islam was both historically and intellectually preposterous. Only someone ignorant of the most elementary facts could believe such a thing, or... be suffering from serious delusions.

From the first, Islam was a religion of pillage, violence, and compulsion, which it justified and glorified. And it is certainly not "the evident truth of the doctrine itself," to quote Gibbon on what, with characteristic irony, he called the primary reason for the rapid spread of Christianity throughout the civilized world, that explains the exponential growth of the Dar-al-Islam in its early history.

It is important, of course, to distinguish between Islam as a doctrine and Muslims as people. Untold numbers of Muslims desire little more than a quiet life; they have the virtues and the vices of the rest of mankind. Their religion gives to their daily lives an ethical and ritual structure and provides the kind of boundaries that only modern Western intellectuals would have the temerity to belittle.

But the fact that many Muslims are not fanatics is not as comforting as some might think.

In his new book, Islamic Imperialism: A History, Professor Efraim Karsh does not mince words about Mohammed's early and (to all those who do not accept the divinity of his inspiration) unscrupulous resort to robbery and violence, or about Islam's militaristic aspects, or about the link between Islamic tradition and the current wave of fundamentalist violence in the world. The originality of Karsh's interpretation is its underlying assumption that Islam was, from the very beginning, a pretext for personal and dynastic political ambition, from the razzias against the Meccan caravans and the expulsion of Jewish tribes from Medina, to the siege of Vienna a millennium later in 1529, and Hamas today.

Contrary to its universalistic pretensions, Karsh argues, Islam has never succeeded in eliminating political power struggles within the Muslim world, where, on the contrary, such struggles have always been murderous. Islamic regimes, many espousing in the beginning the ascetic principles of what one might call desert Islam, invariably degenerate (if it be degeneration) into luxury- and privilege-loving dynasties. Like all other political entities, Islamic regimes seek to preserve and, if possible, extend their power. They have shown no hesitation in compromising with or allying themselves with those whom they regard as infidels.

Saladin, a mendaciously simplified version of whose exploits has inflamed hysterical sentiment all over the Middle East, was not above forming alliances with Christian monarchs to achieve his imperial ends; the Ottoman caliphate would not have survived as long as it did had the Sultan not exploited European rivalries and allied himself now with one, now with another Christian power.

In short, Islamic imperialism, in Karsh's view, illustrates three transcendent political truths: the Nietzschean drive to power, Michels' iron law of oligarchy, and Marx's economic motor of history. Religious feeling, on this reading, is but an epiphenomenon, a mask for what is really going on.

This interpretation raises the difficult and perhaps unanswerable question of what should count in history as a real, and what as merely an apparent, motive for action. When Bernal Diaz del Castillo claims a religious motive for the conquest of Mexico, at least in part, should we just dismiss it as a sanctimonious lie to justify a more rapacious motive? That he ended up a rich man does not decide the question; and Diaz himself would have taken his material success as a sign that God smiled upon his enterprise, just as Muslims have viewed their early conquests as proof of God's approval and the truth of Mohammed's doctrine. (On the other hand, failure for Muslims never seems to provide proof of the final withdrawal of God's favor, much less of his non-existence, but rather shows his dissatisfaction with the current practices of the supposedly faithful, who will return to His favor only by restoring an earlier, purer form of faith.)

Karsh seems to oscillate between believing that Islamic imperialism is just a variant of imperialism in general—imperialism being more or less a permanent manifestation of the human will to power—and believing that there is something sui generis and therefore uniquely dangerous about it.

I hesitate to rush in where so many better-informed people have hesitated to tread, or have trodden before, but I would put it like this. The urge to domination is nearly a constant of human history. The specific (and baleful) contribution of Islam is that, by attributing sovereignty solely to God, and by pretending in a philosophically primitive way that God's will is knowable independently of human interpretation, and therefore of human interest and desire—in short by allowing nothing to human as against divine nature—it tries to abolish politics. All compromises become mere truces; there is no virtue in compromise in itself. Thus Islam is inherently an unsettling and dangerous factor in world politics, independently of the actual conduct of many Muslims.

Karsh comes close to this conclusion himself, when he writes at the end of the book:

Only when the political elites of the Middle East and the Muslim world reconcile themselves to the reality of state nationalism, forswear pan-Arab and pan-Islamic dreams, and make Islam a matter of private faith rather than a tool of political ambition will the inhabitants of these regions at last be able to look forward to a better future free of would-be Saladins.

The fundamental question is whether Islam as a private faith would still be Islam, or whether such privatization would spell its doom. Do we have the luxury of time to find out? After all, it took two millennia for Christianity to become defanged. All the more reason for Epicureans to take to their gardens.

Tuesday, May 30, 2006

E.O. Wilson now places group selection above kin selection

The idea that group selection (or multilevel selection) could have any validity is sometimes dismissed in rather derogatory terms. It may therefore come as a surprise that one of the main "fathers" of ev psych, Edward O. Wilson, now theorizes that kin selection is NOT what drove the evolution of eusocial insects, as widely accepted, but rather group selection -- and the same seems to hold true for humans.

In an interview in the June 2006 Discover Magazine (pp. 58-61), Wilson says that one reason he now rejects the "standard theory" he helped develop is that there's very little evidence that ants and termites in the early stages of evolution could determine who's a brother, sister, cousin, etc. He says: "They're not acting to favor collateral kin. The new view that I'm proposing is that it was group selection all along, an idea first roughly formulated by Darwin."

The key to Wilson's new theory is the relatively recent recognition that genes can be plastic in their expression, in response to different environmental conditions.

"So consider a gene", he writes, "that has placticity such that in one setting an individual carrying that gene becomes reproductive. Maybe this individual was the ant or wasp that arrived first, maybe it was the biggest one, or maybe it was the one to just by accident start laying eggs first. The important thing is that the reproductive role can shift from one colony to next and from one generation to the next. The group forms, and some individuals by circumstance become workers. Their cooperative behavior and the division of labor confer superiority on that group, with that particular gene, over other groups. It could be as simple as that."

Wilson explains that altruism is normally discouraged due to the fitness advantages of individual survival and reproduction, but it could pay for individuals to subvert their own interests to those of a group if the group is able to defend and exploit a very valuable resource (such as a hollow stem that could be a nest site). And once ants and termites became "fully social" they went on to dominate the world.

As for humans, Wilson agrees with Darwin that our evolution was largely a matter of "tribe against tribe" -- which might explain the endemic warfare AND altruism in which humans have engaged since prehistory. "The genes that favor this type of group cohesion would also favor an innate sense of morality and group loyalty. It would explain how so often group or tribe loyalty overrides even family loyalty."

Environment modifies genes

From Discover Magazine online

A Mother's Touch
Good parents can change children's DNA.
By Victor Limjoco
May 12, 2006 | Mind & Brain

Be grateful to your mom. Not only did she carry you around for nine months, but now new research suggests that her mothering style may have triggered genes that help determine your parenting style.

Columbia University neurobiologist Frances Champagne says that previous research across species showed that maternal behaviors are passed down from mother to daughter.

"So if your mother held you a lot, you will hold your infants a lot," Champagne says.

But she wanted to know whether mothering tendencies are passed on through genetics or experience. Her team studied mother rats that spent time licking and grooming their babies, and others that didn't.

As she wrote in the journal "Endocrinology," without enough licking and grooming, female rats had certain genes turn off, preventing the production of certain hormones key to future mothering behaviors, including estrogen and oxytocin, also known as the love hormone.

Licked rats had a higher production of those hormones, which, in turn, affected behavior when these baby rats became mothers themselves. Champagne says that this combination of genes and environment passes maternal behaviors from generation to generation.

Champagne notes that maternal behavior is complex and that a mother's touch is just one part of a larger puzzle. But she says that these results highlight the need for bonding early in life. "Mothers are incredibly important," she says. "The quality of care that they can provide to infants is crucial for shaping infant development. And will have consequences for the next generation of mothers and infants."

Wednesday, May 24, 2006

Rapid gene change via behavior

Individuals with mutations in the tumor suppressor gene PTEN are prone not only to tumors but also to brain disorders, including macrocephaly (enlarged head circumference), seizures and mental retardation. Although PTEN mutations have been reported in autistic patients with macrocephaly, it is not clear whether there is a causal link between this gene and autistic spectrum disorders (ASD). Writing in Neuron, Chang-Hyuk Kwon and colleagues provide direct evidence that inactivation of Pten in mice results in neuropathological changes as well as abnormalities in social interaction.

Using several behavioral models, the researchers show that the Pten-mutant mice have deficits in social learning and interaction. For example, the mutant animals spent less time investigating the social target (a new mouse) compared with controls. When presented with a choice between the social target and an inanimate object, the Pten mutants spent similar amounts of time interacting with both. When the social target was removed and later re-introduced, the mutant mice, unlike their normal counterparts, did not reduce their interaction with it, indicating that they might have impaired social learning.

Our genes, like our environment, merely increase the probability of certain states of affairs or behaviors without determining anything specific.

Sunday, May 21, 2006

Kwame Anthony Appiah on Cosmopolitanism

The fear is that the values and images of western mass culture, like some invasive weed, are threatening to choke out the world’s native flora.

The right approach, I think, starts by taking individuals – not nations, tribes or ‘people’ – as the proper object of moral concern. It doesn’t much matter what we call such a creed, but in homage to Diogenes, the fourth-century Greek Cynic and the first philosopher to call himself a ‘citizen of the world’, we could call it cosmopolitan. Cosmopolitans take cultural difference seriously, because they take choices individuals make seriously. But because difference is not the only thing that concerns them, they suspect that many of globalisation’s cultural critics are aiming at the wrong targets....

Yes, globalisation can produce homogeneity. But globalisation is also a threat to homogeneity. That prospect is unsettling for some people (just as it is exciting for others)....

Urbanity: the big, polyglot, diverse world of the city.

Human variety matters, cosmopolitans think, because people are entitled to options. What John Stuart Mill said over a century ago in On Liberty about diversity within a society serves just as well as an argument for variety across the globe: “If it were only that people have diversities of taste, that is reason enough for not attempting to shape them all after one model. But different persons also require different conditions for their spiritual development; and can no more exist healthily in the same moral, than all the variety of plants can exist in the same physical, atmosphere and climate. The same things which are helps to one person towards the cultivation of his higher nature, are hindrances to another… Unless there is a corresponding diversity in their modes of life, they neither obtain their fair share of happiness, nor grow up to the mental, moral and aesthetic stature of which their nature is capable.”

The textiles most people think of as traditional West African cloths are known as Java prints; they arrived in the 19th century with the Javanese batiks sold, and often milled, by the Dutch. The traditional garb of Herero women in Namibia derives from the attire of 19th-century German missionaries, though it is still unmistakably Herero, not least because the fabrics used have a distinctly un-Lutheran range of colours. And so with our [Ghanaian] kente cloth: the silk was always imported, traded by Europeans, produced in Asia. This tradition was once an innovation. Should we reject it for that reason as untraditional? How far back must one go? Should we condemn the young men and women of the University of Science and Technology, a few miles outside Kumasi, who wear European-style gowns for graduation, lined with strips? Cultures are made of continuities and changes, and the identity of a society can survive through these changes. Societies without change aren't authentic; they're just dead.

From Cosmopolitanism: Ethics in a World of Strangers, W.W. Norton, Jan 2006

Comment from Izaak Van Gaalen: "Princeton philosophy professor Kwame Anthony Appiah attempts to articulate an ethical theory that applies to our current age of globalization. Taking as his starting point the writings of Diogenes, the 4th century Greek Cynic philosopher, Appiah develops a philosophy of cosmopolitanism modeled on Diogenes' "citizen of the world." A citizen of the world regards the individual rather than family, tribe, or nation as the primary focus of ethical agency. He also emphasizes that it is important to recognize that individuals are bound by belief systems and cultures that are not only different but may also be opposed to their own. Cosmopolitanism is an ethics somewhere between relativism and universalism that can build a working relationship between adherents of different belief systems, enabling coexistence but not necessarily agreement.

This is not as easy as it sounds; in fact, it doesn't even sound easy. Cosmopolitanism is, in addition to Diogenes' legacy, a product of the Enlightenment in that it celebrates diversity and multiculturalism; it is tolerant of diverse moralities. However, it is intolerant of those who would deny tolerance of this diversity or plurality. This is the central dilemma of cosmopolitanism. It attempts to reconcile liberal universal values with the values of those who disagree with them. Cosmopolitanism believes in the basic freedoms, including freedom of speech, but it will curb any speech that calls for restricting that freedom.

'Cosmopolitans don't insist that everyone become cosmopolitan. They know they don't have all the answers. They're humble enough to think that they might learn from strangers; not too humble to think that strangers can't learn from them.'"

Comment from David E. McClean: "There is, so far, no better or more mature book on moral cosmopolitanism than Appiah's Cosmopolitanism: Ethics in a World of Strangers. In it, Appiah makes plain, by well-crafted appeals to the reader's good sense that are replete with ethnographic examples and real-world insights, what romantics and theologians have been telling us for ages: There is but a hair's breadth of difference between us; a tiny space that we can fill with causes for consternation and hatred, or with salutary joy at considering that difference. This Appiah does without in any way suggesting that there can ever be an end to the moral and cultural tensions that those differences do and must invite. He sketches the tenable cosmopolitanism we have been waiting for, and he parts company with the sentimentalist versions that remain - and should remain - in the shallow end of the pool.

Appiah, here as elsewhere (The Ethics of Identity), marvels that so many intellectuals have distorted the truth about the key insights of cosmopolitans, and he takes them to task. These have argued that cosmopolitanism contains an incredible and/or dangerous set of normative proposals and disregards the "facts" of human nature (that we are an insular species, with a territoriality that is red in tooth and claw). Appiah deftly replies that it is the cultural conservative, the jejune jingoist or nationalist, the duped hyper-contextualist, whose view of the world and of human nature is distorted, for the history of human social, cultural and even sexual intercourse is replete with cross-pollinations of language, religion, art, dress, rites, metaphysical outlooks, and progeny, all bespeaking an enormous aptitude for cooperation, bonding and friendship. We are an inter-cultural, intertwined, and interdependent species, just like every other on the planet. The view of ourselves as culturally isolated is the view that bears the burden of proof. It is, in fact, demonstrably false.

Appiah laments that so many philosophers and intellectuals, adopting a bad historicism, have argued, falsely, that we humans can only see the world up to the point of our own contextual "walls." He joins many - George Lakoff, Martha Nussbaum, William Sloane Coffin, Mohandas K. Gandhi, R.W. Emerson - in arguing that the greater truth of our humanity is our ability to imaginatively think new thoughts, to reconsider plans of life, to fashion new worlds of possibility, while acknowledging that each of us has a home that we should cherish, improve, perfect, and defend."

Sunday, May 14, 2006

Tom Wolfe on writing

Bruce Cole [Chairman of the National Endowment for the Humanities]: Let me turn to your work. You've described yourself as a chronicler. What is that exactly?

Wolfe: Balzac enjoyed saying, "I am the secretary of French society," meaning a secretary who takes notes, not like the Secretary of Labor or something . . . He keeps tabs on what is happening in society, in the sense of social mores as well as just "society" with a small s. If I'm working well, I'm first and foremost bringing the news.

That was Nietzsche's expression when he said "God is dead." He said this is not a manifesto for atheism. He said, I'm just bringing you the news. I'm bringing you the news of the biggest event in modern history. God is dead, by which he meant, of course, that educated people were beginning to have no faith in God any longer. This was the 1880s. He predicted that in the twentieth century would come the rise of "barbaric nationalistic brotherhoods," leading to "wars such as have never been fought before." In other words, he predicted Nazism, Communism, and the world wars. Not bad, no matter what anybody thinks about his overarching take on life. In the twenty-first century, he said, would come the total collapse of all values.

He said if that happens, it will be worse than the world wars. He said the psychological devastation when people come to the point where they believe there is absolutely no meaning to life will be horrifying.

Friday, May 12, 2006

Oochy woochy coochy coo

May 11th 2006
From The Economist
Women can read men like books

A GROUP of scientists has discovered that women are attracted to men who are fond of children. In years gone by, that announcement might have qualified for one of the late Senator William Proxmire's Golden Fleece awards for pointless scientific research—except that what this particular group of scientists has shown is that women can tell who is and is not fond of children just by looking at their faces.

The members of the group in question, led by James Roney of the University of California, Santa Barbara, are part of the revival of a science that once dared not speak its name—physiognomy. In the late 18th century, and during most of the 19th, it was believed that the shape of a person's head could tell you something about his character. Such deterministic thoughts fell out of favour during the 20th century. Most behavioural scientists thought that environment, not biology, shaped behaviour, and even those who did not could not see how the shape of the head or features of the face could possibly be relevant. What Dr Roney and his colleagues have found is that they are.

Their 39 male subjects, selected from a variety of ethnic backgrounds, were shown 20 pairs of pictures, each depicting an adult and an infant. They were asked to signify their preference for either the adult or the child. Some reported no interest in the child at all. The rest expressed a range of interest, including a few who always preferred the pictures of infants. The men also provided saliva swabs to assess their testosterone levels. The researchers then took digital photographs of the men and doctored the images so that their hairstyles were obscured, and could not affect the judgments of the female subjects.

These were a group of 29 women, from equally diverse backgrounds, who were shown the photographs. They were asked to rate the men according to whether they thought the men liked children, and whether those men appeared masculine and physically attractive. They were also asked to say which men they preferred for short-term and which for long-term relationships. The results, which have just been published in the Proceedings of the Royal Society, confirm that women are very good at reading faces.

The first part of the study provided confirmation of work done previously by other groups, using different methods. When asked to rate the men's masculinity, the women agreed on who was top and who was bottom, and their rankings correlated with the testosterone levels from the swabs. What was novel was that when asked to rate the men's liking of children from the photographs, they ranked them in the same order as the researchers had done from the interest the men themselves had shown in pictures of infants.

In physiognomic terms, the first result is easy to explain. Testosterone has multiple effects. When its production rises during puberty, it causes both body and mind to be reshaped, so it is little surprise that the former (square jaws and so on) reflect the latter (lust). But Dr Roney and his colleagues were unable to quantify what it was about the faces of the baby-friendly that signalled this attitude to women.

When asked with whom they would prefer to have a short-term relationship, women tended to pick the high-testosterone males. This makes sense from an evolutionary point of view, since testosterone suppresses the immune system. Like the proverbial peacock's tail, an excess of testosterone suggests that an individual must have particularly disease-resistant genes in order to compensate. These make desirable partners for a woman's own genes in her children. The problem with testosterone-fuelled males is that they are less likely to remain faithful to their partners.

By contrast, men who show an interest in children are also likely to make good partners, because they will care for their offspring. The study showed that women prefer these men for long-term relationships. Again, no surprise.

The surprise is this: some men were perceived both as masculine and as interested in children. From an evolutionary point of view, a trade-off between the two would have been predicted. That would produce what is known as an evolutionarily stable strategy in which the child-loving men father fewer babies to start with, but see as many live to maturity because they help to raise them rather than deserting the mothers. From the female point of view, the existence of men who are both hunky and child-friendly might seem too good to be true. For the men involved, it certainly seems like a lot of hard work.

Tuesday, May 09, 2006

The importance of delayed gratification

Self Control is the Key to Success

AROUND 1970, psychologist Walter Mischel launched a classic experiment. He left a succession of 4-year-olds in a room with a bell and a marshmallow. If they rang the bell, he would come back and they could eat the marshmallow. If, however, they didn't ring the bell and waited for him to come back on his own, they could then have two marshmallows.

In videos of the experiment, you can see the children squirming, kicking, hiding their eyes -- desperately trying to exercise self-control so they can wait and get two marshmallows. Their performance varied widely. Some broke down and rang the bell within a minute. Others lasted 15 minutes.

The children who waited longer went on to get higher SAT scores. They got into better colleges and had, on average, better adult outcomes. The children who rang the bell quickest were more likely to become bullies. They received worse teacher and parental evaluations 10 years later and were more likely to have drug problems at age 32.

The Mischel experiments are worth noting because people in the policy world spend a lot of time thinking about how to improve education, how to reduce poverty, how to make the most of the nation's human capital. But when policymakers address these problems, they come up with structural remedies: reduce class sizes, create more charter schools, increase teacher pay, mandate universal day care and try vouchers.

The results of these structural reforms are almost always disappointingly modest. Yet policymakers rarely probe deeper into problems and ask the core questions, such as: how do we get people to master the sort of self-control that leads to success? To ask that question is to leave the policymakers' comfort zone -- which is the world of inputs and outputs, appropriations and bureaucratic reform -- and to enter the murky world of psychology and human nature....

If you're a policymaker and you are not talking about core psychological traits such as delayed gratification skills, then you're just dancing around with proxy issues. The research we do have on delayed gratification tells us that differences in self-control skills are deeply rooted but also malleable. Differences in the ability to focus attention and exercise control emerge very early, perhaps as soon as nine months. But there is no consensus on how much of the ability to exercise self-control is hereditary and how much is environmental....

What works, says Jonathan Haidt, the author of "The Happiness Hypothesis," is creating stable, predictable environments for children, in which good behavior pays off -- and practice. Young people who are given a series of tests that demand self-control get better at it.

This pattern would be too obvious to mention if it weren't so largely ignored by educators and policymakers. Somehow we've entered a world in which we obsess over structural reforms and standardized tests, but skirt around the moral and psychological traits that are at the heart of actual success. Mischel tried to interest New York schools in programs based on his research. Needless to say, he found almost no takers.

Race and War

White Guilt and the Western Past
Why is America so delicate with the enemy?

Tuesday, May 2, 2006
Wall Street Journal Online

There is something rather odd in the way America has come to fight its wars since World War II.

For one thing, it is now unimaginable that we would use anything approaching the full measure of our military power (the nuclear option aside) in the wars we fight. And this seems only reasonable given the relative weakness of our Third World enemies in Vietnam and in the Middle East. But the fact is that we lost in Vietnam, and today, despite our vast power, we are only slogging along--if admirably--in Iraq against a hit-and-run insurgency that cannot stop us even as we seem unable to stop it. Yet no one--including, very likely, the insurgents themselves--believes that America lacks the raw power to defeat this insurgency if it wants to. So clearly it is America that determines the scale of this war. It is America, in fact, that fights so as to make a little room for an insurgency.

Certainly since Vietnam, America has increasingly practiced a policy of minimalism and restraint in war. And now this unacknowledged policy, which always makes a space for the enemy, has us in another long and rather passionless war against a weak enemy.

Why this new minimalism in war?

It began, I believe, in a late-20th-century event that transformed the world more profoundly than the collapse of communism: the world-wide collapse of white supremacy as a source of moral authority, political legitimacy and even sovereignty. This idea had organized the entire world, divided up its resources, imposed the nation-state system across the globe, and delivered the majority of the world's population into servitude and oppression. After World War II, revolutions across the globe, from India to Algeria and from Indonesia to the American civil rights revolution, defeated the authority inherent in white supremacy, if not the idea itself. And this defeat exacted a price: the West was left stigmatized by its sins. Today, the white West--like Germany after the Nazi defeat--lives in a kind of secular penitence in which the slightest echo of past sins brings down withering condemnation. There is now a cloud over white skin where there once was unquestioned authority.

I call this white guilt not because it is a guilt of conscience but because people stigmatized with moral crimes--here racism and imperialism--lack moral authority and so act guiltily whether they feel guilt or not.

They struggle, above all else, to dissociate themselves from the past sins they are stigmatized with. When they behave in ways that invoke the memory of those sins, they must labor to prove that they have not relapsed into their group's former sinfulness. So when America--the greatest embodiment of Western power--goes to war in Third World Iraq, it must also labor to dissociate that action from the great Western sin of imperialism. Thus, in Iraq we are in two wars, one against an insurgency and another against the past--two fronts, two victories to win, one military, the other a victory of dissociation.

The collapse of white supremacy--and the resulting white guilt--introduced a new mechanism of power into the world: stigmatization with the evil of the Western past. And this stigmatization is power because it affects the terms of legitimacy for Western nations and for their actions in the world. In Iraq, America is fighting as much for the legitimacy of its war effort as for victory in war. In fact, legitimacy may be the more important goal. If a military victory makes us look like an imperialist nation bent on occupying and raping the resources of a poor brown nation, then victory would mean less because it would have no legitimacy. Europe would scorn. Conversely, if America suffered a military loss in Iraq but in so doing dispelled the imperialist stigma, the loss would be seen as a necessary sacrifice made to restore our nation's legitimacy. Europe's halls of internationalism would suddenly open to us.

Because dissociation from the racist and imperialist stigma is so tied to legitimacy in this age of white guilt, America's act of going to war can have legitimacy only if it seems to be an act of social work--something that uplifts and transforms the poor brown nation (thus dissociating us from the white exploitations of old). So our war effort in Iraq is shrouded in a new language of social work in which democracy is cast as an instrument of social transformation bringing new institutions, new relations between men and women, new ideas of individual autonomy, new and more open forms of education, new ways of overcoming poverty--war as the Great Society. This does not mean that President Bush is insincere in his desire to bring democracy to Iraq, nor is it to say that democracy won't ultimately be socially transformative in Iraq. It's just that today the United States cannot go to war in the Third World simply to defeat a dangerous enemy.

White guilt makes our Third World enemies into colored victims, people whose problems--even the tyrannies they live under--were created by the historical disruptions and injustices of the white West. We must "understand" and pity our enemy even as we fight him. And, though Islamic extremism is one of the most pernicious forms of evil opportunism that has ever existed, we have felt compelled to fight it with an almost managerial minimalism that shows us to be beyond the passions of war--and thus well dissociated from the avariciousness of the white supremacist past.

Anti-Americanism, whether in Europe or on the American left, works by the mechanism of white guilt. It stigmatizes America with all the imperialistic and racist ugliness of the white Western past so that America becomes a kind of straw man, a construct of Western sin. (The Abu Ghraib and Guantanamo prisons were the focus of such stigmatization campaigns.) Once the stigma is in place, one need only be anti-American in order to be "good," in order to have an automatic moral legitimacy and power in relation to America. (People as seemingly disparate as President Jacques Chirac and the Rev. Al Sharpton are devoted pursuers of the moral high ground to be had in anti-Americanism.) This formula is the most dependable source of power for today's international left. Virtue and power by mere anti-Americanism. And it is all the more appealing since, unlike real virtues, it requires no sacrifice or effort--only outrage at every slight echo of the imperialist past.

Today words like "power" and "victory" are so stigmatized with Western sin that, in many quarters, it is politically incorrect even to utter them. For the West, "might" can never be right. And victory, when won by the West against a Third World enemy, is always oppression. But, in reality, military victory is also the victory of one idea and the defeat of another. Only American victory in Iraq defeats the idea of Islamic extremism. But in today's atmosphere of Western contrition, it is impolitic to say so.

America and the broader West are now going through a rather tender era, a time when Western societies have very little defense against the moral accusations that come from their own left wings and from those vast stretches of nonwhite humanity that were once so disregarded.

Europeans are utterly confounded by the swelling Muslim populations in their midst. America has run from its own mounting immigration problem for decades, and even today, after finally taking up the issue, our government seems entirely flummoxed. White guilt is a vacuum of moral authority visited on the present by the shames of the past. In the abstract it seems a slight thing, almost irrelevant, an unconvincing proposition. Yet a society as enormously powerful as America lacks the authority to ask its most brilliant, wealthy and superbly educated minority students to compete freely for college admission with poor whites who lack all these things. Just can't do it.

Whether the problem is race relations, education, immigration or war, white guilt imposes so much minimalism and restraint that our worst problems tend to linger and deepen. Our leaders work within a double bind. If they do what is truly necessary to solve a problem--win a war, fix immigration--they lose legitimacy.

To maintain their legitimacy, they practice the minimalism that makes problems linger. What but minimalism is left when you are running from stigmatization as a "unilateralist cowboy"? And where is the will to truly regulate the southern border when those who ask for this are slimed as bigots? This is how white guilt defines what is possible in America. You go at a problem until you meet stigmatization, then you retreat into minimalism.

Possibly white guilt's worst effect is that it does not permit whites--and nonwhites--to appreciate something extraordinary: the fact that whites in America, and even elsewhere in the West, have achieved a truly remarkable moral transformation. One is forbidden to speak thus, but it is simply true. There are no serious advocates of white supremacy in America today, because whites see this idea as morally repugnant. If there is still the odd white bigot out there surviving past his time, there are millions of whites who only feel goodwill toward minorities.

This is a fact that must be integrated into our public life--absorbed as new history--so that America can once again feel the moral authority to seriously tackle its most profound problems. Then, if we decide to go to war, it can be with enough ferocity to win.

Mr. Steele, a research fellow at the Hoover Institution at Stanford University, is author, most recently, of "White Guilt: How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era," published this week by HarperCollins.