Researchers have discovered genetic evidence that human evolution is speeding up - and has not halted or proceeded at a constant rate, as had been thought - indicating that humans on different continents are becoming increasingly different.
"We used a new genomic technology to show that humans are evolving rapidly, and that the pace of change has accelerated a lot in the last 40,000 years, especially since the end of the Ice Age roughly 10,000 years ago," says research team leader Henry Harpending, a distinguished professor of anthropology at the University of Utah.
Harpending says there are provocative implications from the study, published online Monday, Dec. 10 in the journal Proceedings of the National Academy of Sciences:
-- "We aren't the same as people even 1,000 or 2,000 years ago," he says, which may explain, for example, part of the difference between Viking invaders and their peaceful Swedish descendants. "The dogma has been these are cultural fluctuations, but almost any temperament trait you look at is under strong genetic influence."
-- "Human races are evolving away from each other," Harpending says. "Genes are evolving fast in Europe, Asia and Africa, but almost all of these are unique to their continent of origin. We are getting less alike, not merging into a single, mixed humanity." He says that is happening because humans dispersed from Africa to other regions 40,000 years ago, "and there has not been much flow of genes between the regions since then."
"Our study denies the widely held assumption or belief that modern humans [those who widely adopted advanced tools and art] appeared 40,000 years ago, have not changed since and that we are all pretty much the same. We show that humans are changing relatively rapidly on a scale of centuries to millennia, and that these changes are different in different continental groups."
The increase in human population from millions to billions in the last 10,000 years accelerated the rate of evolution because "we were in new environments to which we needed to adapt," Harpending adds. "And with a larger population, more mutations occurred."
Study co-author Gregory M. Cochran says: "History looks more and more like a science fiction novel in which mutants repeatedly arose and displaced normal humans - sometimes quietly, by surviving starvation and disease better, sometimes as a conquering horde. And we are those mutants."
Harpending conducted the study with Cochran, a New Mexico physicist, self-taught evolutionary biologist and adjunct professor of anthropology at the University of Utah; anthropologist John Hawks, a former Utah postdoctoral researcher now at the University of Wisconsin, Madison; geneticist Eric Wang of Affymetrix, Inc. in Santa Clara, Calif.; and biochemist Robert Moyzis of the University of California, Irvine.
Thursday, December 06, 2007
Monday, November 12, 2007
Educators and psychologists have long feared that children entering school with behavior problems were doomed to fall behind in the upper grades. But two new studies suggest that those fears are exaggerated.
One concluded that kindergartners who are identified as troubled do as well academically as their peers in elementary school. The other found that children with attention deficit disorders suffer primarily from a delay in brain development, not from a deficit or flaw.
Experts say the findings of the two studies, being published Tuesday in separate journals, could change the way scientists, teachers and parents understand and manage children who are disruptive or emotionally withdrawn in the early years of school. The studies might even prompt a reassessment of the possible causes of disruptive behavior in some children.
“I think these may become landmark findings, forcing us to ask whether these acting-out kinds of problems are secondary to the inappropriate maturity expectations that some educators place on young children as soon as they enter classrooms,” said Sharon Landesman Ramey, director of the Georgetown University Center on Health and Education, who was not connected with either study.
In one study, an international team of researchers analyzed measures of social and intellectual development from more than 16,000 children and found that disruptive or antisocial behaviors in kindergarten did not correlate with academic success at the end of elementary school.
Kindergartners who interrupted the teacher, defied instructions and even picked fights were performing as well in reading and math as well-behaved children of the same abilities when they both reached fifth grade, the study found....
In the other study, researchers from the National Institute of Mental Health and McGill University, using imaging techniques, found that the brains of children with attention-deficit hyperactivity disorder developed normally but more slowly in some areas than the brains of children without the disorder.
The disorder, also known as A.D.H.D., is by far the most common psychiatric diagnosis given to disruptive young children; 3 percent to 5 percent of school-age children are thought to be affected. Researchers have long debated whether it results from a brain deficit or from a delay in development.
Doctors said that the report, being published in The Proceedings of the National Academy of Sciences, helps to explain why so many children grow out of the diagnosis in middle school or later, often after taking stimulant medications to improve concentration in earlier grades.
The findings in the first study grew out of a collaboration among a dozen leading researchers to reassess data from six large child-development studies performed since 1970. Each of these six studies tracked hundreds of children from an early age through elementary school on a number of measures, from reading and math skills to emotional stability and concentration, or attention. Most of the studies used teachers’ reports to gauge students’ emotional and social progress and their ability to pay attention when asked.
The researchers adjusted the findings to eliminate the influence of factors like family income and family structure.
While there was little correlation between behavior problems in kindergarten and later academic success, the researchers did find that scores on math tests at ages 5 or 6 were highly correlated with academic success in fifth grade. Kindergarten reading skills and scores on attention measures — where youngsters with A.D.H.D. falter — also predicted later academic success, but less strongly than math scores did. The pattern was about the same in girls as in boys, and for children from affluent families as well as those from lower-income groups.
The authors of the study suggested that preschool programs might consider developing more effective math training. The findings should also put to rest concerns that boys and girls who are restless, disruptive or withdrawn in kindergarten are bound to suffer academically.
“For kindergarten, it appears teachers are able to work around these behavior problems in a way that enables kids to learn just as much as other kids with equal levels of ability,” said the lead author, Greg J. Duncan, a professor of human development and social policy at Northwestern University....
In the second study, government psychiatric researchers compared brain scans from two groups of children: one with attention deficit disorder, the other without. The scientists had tracked the children — 223 in each group — from ages 6 to 16, taking multiple scans on each child.
In a normally developing brain, the cerebral cortex — the outer wrapping, where circuits involved in conscious thought are concentrated — thickens during early childhood. It then reverses course and thins out, losing neurons as the brain matures through adolescence. The study found that, on average, the brains of children with A.D.H.D. began this “pruning” process at age 10 ½, about three years later than their peers....
But the greatest delays in brain maturation were found in precisely those areas of the cortex most involved in attention and motor control, said the lead author of the study, Dr. Philip Shaw, a psychiatrist at the National Institute of Mental Health. “Those are exactly the areas where we would expect to find differences,” he said....
“The basic sequence of development in the brains of these kids with A.D.H.D. was intact, absolutely normal,” Dr. Shaw said. “I think this is pretty strong evidence we’re talking about a delay, and not an abnormal brain.”
About three in four children do grow out of the problem by early adulthood, he said.
Tuesday, October 23, 2007
New research indicates babies are born with violent tendencies that most learn to control
It is not the cartoons that make your kids smack playmates or violently grab their toys but, rather, a lack of social skills, according to new research.
"It's a natural behavior and it's surprising that the idea that children and adolescents learn aggression from the media is still relevant," says Richard Tremblay, a professor of pediatrics, psychiatry and psychology at the University of Montreal, who has spent more than two decades tracking 35,000 Canadian children (from age five months through their 20s) in search of the roots of physical aggression. "Clearly youth were violent before television appeared."
Tremblay's previous results have suggested that children on average reach a peak of violent behavior (biting, scratching, screaming, hitting…) around 18 months.
Tremblay on Wednesday is set to present preliminary study results showing a genetic signature consistent with chronic violent behavior at a meeting of The Royal Society, the U.K.'s academy of science, in London.
"We're looking at to what extent the chronically aggressive individuals show differences in terms of gene expressions compared to those on the normal trajectory," he told ScientificAmerican.com. "The individuals that are chronically aggressive have…more genes that are not expressed." The fact that a gene can be silenced or the level of protein it encodes reduced, he added, "is an indication that the problem is at a very basic level."
When children first begin to poke, prod and even slap, parents, teachers and siblings often react by indicating that those behaviors are inappropriate. But, citing studies done in animals, Tremblay notes that an unfit environment beginning in the womb may affect a child's ability to learn this lesson in the first place. And he plans to extend his genetic studies to include expectant mothers to determine if their behavior during pregnancy is linked to the down tuning of genes that may be associated with chronic aggression.
"In the long studies we've been doing, we've measured a number of characteristics during pregnancy and after birth that are good predictors" of chronic aggression in children, Tremblay notes. Possible factors that might influence neurobiological development of the fetus, he says, include smoking, drinking, poor nutrition and excessive stress.
Tremblay speculates that genes play a significant role: for instance, damaged genes may make it hard for children to acquire language, frustrating them and making them prone to violence, among other means of making themselves heard. "When you don't master language," Tremblay says, "it's hard to get people to understand what you want."
Kate Keenan, an associate professor of psychiatry at the University of Chicago, views this new genetic analysis as the logical next step in Tremblay's long-term exploration into childhood aggression. She believes Tremblay's work may help uncover genetic profiles distinct to chronically aggressive children that may allow researchers to answer questions like, "Can we differentiate [between these kids] even earlier?" [and] "How early can you intervene?"
Saturday, August 25, 2007
New York Times
Is There Anything Good About Men? And Other Tricky Questions
By John Tierney
What percentage of your ancestors were men?
No, it’s not 50 percent, as I’ll explain shortly. But first let me credit the source, Roy F. Baumeister, who answered that question – and a lot of other ones – in an address on Friday at the annual convention of the American Psychological Association in San Francisco. I recommend reading the whole speech: "Is There Anything Good About Men?"

As you might expect, he did find something good to say about men, but the speech wasn’t an apologia for the gender, or a whine about the abuse heaped on men. Rather, it was a shrewd and provocative look at the motivational differences between men and women – and at some of the topics (like the gender imbalance on science faculties) that got Larry Summers in so much trouble at Harvard. Dr. Baumeister, a prominent social psychologist who teaches at Florida State University, began by asking gender warriors to go home.

"I’m certainly not denying that culture has exploited women," he said. "But rather than seeing culture as patriarchy, which is to say a conspiracy by men to exploit women, I think it’s more accurate to understand culture (e.g., a country, a religion) as an abstract system that competes against rival systems — and that uses both men and women, often in different ways, to advance its ..."

The "single most underappreciated fact about gender," he said, is the ratio of our male to female ancestors. While it’s true that about half of all the people who ever lived were men, the typical male was much more likely than the typical woman to die without reproducing. Citing recent DNA research, Dr. Baumeister explained that today’s human population is descended from twice as many women as men. Maybe 80 percent of women reproduced, whereas only 40 percent of men did.

"It would be shocking if these vastly different reproductive odds for men and women failed to produce some personality differences," he said, and continued:

For women throughout history (and prehistory), the odds of reproducing have been pretty good. Later in this talk we will ponder things like, why was it so rare for a hundred women to get together and build a ship and sail off to explore unknown regions, whereas men have fairly regularly done such things? But taking chances like that would be stupid, from the perspective of a biological organism seeking to reproduce. They might drown or be killed by savages or catch a disease. For women, the optimal thing to do is go along with the crowd, be nice, play it safe. The odds are good that men will come along and offer sex and you’ll be able to have babies. All that matters is choosing the best offer. We’re descended from women who played it safe.

For men, the outlook was radically different. If you go along with the crowd and play it safe, the odds are you won’t have children. Most men who ever lived did not have descendants who are alive today. Their lines were dead ends. Hence it was necessary to take chances, try new things, be creative, explore other ...

The second big motivational difference between the genders, he went on, involves the kind of social relationships sought by each sex. While other researchers have argued that women are more "social" than men – more helpful and less aggressive towards others — Dr. Baumeister argued that women can be plenty aggressive in the relationships that matter most to them, which are intimate relationships. Men are more aggressive when it comes to dealing with strangers, because they’re more interested than women are in a wider network of shallow relationships.

"We shouldn’t automatically see men as second-class human beings simply because they specialize in the less important, less satisfying kind of relationship," he said. Men are social, too, he said, just in a different way, with more focus on larger groups: "If you make a list of activities that are done in large groups, you are likely to have a list of things that men do and enjoy more than women: team sports, politics, large corporations, economic networks, and so forth."

There’s lots more in the speech, but I’ll leave you with Dr. Baumeister’s conclusion summarizing his argument:

A few lucky men are at the top of society and enjoy the culture’s best rewards. Others, less fortunate, have their lives chewed up by it. Culture uses both men and women, but most cultures use them in somewhat different ways. Most cultures see individual men as more expendable than individual women, and this difference is probably based on nature, in whose reproductive competition some men are the big losers and other men are the biggest winners. Hence it uses men for the many risky jobs it has.

Men go to extremes more than women, and this fits in well with culture using them to try out lots of different things, rewarding the winners and crushing the losers.

Culture is not about men against women. By and large, cultural progress emerged from groups of men working with and against other men. While women concentrated on the close relationships that enabled the species to survive, men created the bigger networks of shallow relationships, less necessary for survival but eventually enabling culture to flourish. The gradual creation of wealth, knowledge, and power in the men’s sphere was the source of gender inequality. Men created the big social structures that comprise society, and men still are mainly responsible for this, even though we now see that women can perform perfectly well in these large systems.

What seems to have worked best for cultures is to play off the men against each other, competing for respect and other rewards that end up distributed very unequally. Men have to prove themselves by producing things the society values. They have to prevail over rivals and enemies in cultural competitions, which is probably why they aren’t as lovable as women.

The essence of how culture uses men depends on a basic social insecurity. This insecurity is in fact social, existential, and biological. Built into the male role is the danger of not being good enough to be accepted and respected and even the danger of not being able to do well enough to create offspring.

The basic social insecurity of manhood is stressful for the men, and it is hardly surprising that so many men crack up or do evil or heroic things or die younger than women. But that insecurity is useful and productive for the culture, the system.

Again, I’m not saying it’s right, or fair, or proper. But it has worked. The cultures that have succeeded have used this formula, and that is one reason that they have succeeded instead of their rivals.
Wednesday, August 22, 2007
By Rebecca Morelle
BBC News science reporter
Modern people possess less prominent features but higher foreheads than our medieval ancestors. Writing in the British Dental Journal, the team took careful measurements of groups of skulls spanning 30 generations. The scientists said the differences between past and present skull shapes were "striking".
The team used radiographic films of skulls to record extensive measurements taken by a computer. They looked at 30 skulls dating from the mid-14th Century. These had come from the unlucky victims of the plague and had been excavated from plague pits in the 1980s in London.
Another 54 skulls examined by the team were recovered from the wreck of the Mary Rose which sank off the south coast of England in 1545.
All the skulls were compared with 31 recent orthodontic records from the School of Dentistry in Birmingham.
Dr Peter Rock, lead author of the study and director of orthodontics at Birmingham University, told the BBC News website: "The astonishing finding is the increased cranial vault heights.
"The increase is very considerable. For example, the vault height of the plague skulls was 80mm, and the modern ones were 95mm - that's in the order of 20% bigger, which is really rather a lot."
He suggests that the increase in size may be due to an increase in mental capacity over the ages. [emphasis added] ...
Published: 2006/01/25 09:12:59 GMT
Monday, August 20, 2007
By Leigh Dayton
August 21, 2007 02:00am
IT'S official. Blue is the most popular colour, and women really do prefer pink and reddish shades of blue like lilac and purple.
And the preference isn't just a result of social stereotypes, pushing pink on girls and blue on boys. It's innate and occurs across cultures, claim British researchers who studied the colour preferences of 208 young adults: 171 Britons and 37 mainland Chinese.
"Although we expected to find sex differences, we were surprised at how robust they were, given the simplicity of our test," said visual neuroscientist Anya Hurlbert of Newcastle University at Newcastle upon Tyne.
Along with psychologist Yazhu Ling, Professor Hurlbert asked volunteers to select, as quickly as possible, their preferred colour from each of a series of paired, coloured rectangles. They reported yesterday in the journal Current Biology that the most popular colour by far was blue.
"On top of that, females have a preference for the red end of the red-green axis, and this shifts their colour preference slightly away from blue towards red, which tends to make pinks and lilacs the most preferred colours in comparison with others," Professor Hurlbert said.
The finding was so strong that observers could pick the sex of people based upon their colour preferences alone.
"It's a fairly nice piece of evidence," commented Rob Brooks, an evolutionary biologist with the University of NSW in Sydney.
"Anyone with a son or daughter would probably get the sense that (colour preference) is not all socialised. My little girl loves pink and I don't know where the hell it comes from," Professor Brooks said.
Thursday, July 19, 2007
By J. Philippe Rushton
August 12, 2004
Over a century ago, Sir Francis Galton initiated research into individual and race differences in intelligence and temperament. He was the first to propose the study of human twins and of selective breeding in animals to disentangle the effects of heredity and environment. And it was Galton—who spent several years exploring in what is now Namibia as a young man—who first contrasted the talkative impulsivity of Africans with the taciturn reserve of American Indians, and the placidity of the Chinese.
Galton further noted that these temperament differences persisted irrespective of climate (from the frozen north through the torrid equator), and religion, language, or political system (whether self-ruled or governed by the Spanish, Portuguese, English or French).
Anticipating later studies of transracial adoptions, Galton observed that the majority of individuals adhered to racial type even after being raised by White settlers.
In my book Race, Evolution, and Behavior, I review the evidence accumulated since Galton’s pioneering studies. This shows that his views were largely correct. Twin and adoption studies (such as those of identical twins raised apart by Professor Thomas J. Bouchard Jr. at the University of Minnesota) show that traits like Extraversion and Neuroticism are substantially heritable.
Temperamental differences, measured objectively by activity recorders attached to arms and legs, show up in babies. African babies are more active sooner and develop earlier than White babies who, in turn, are more active than East Asian babies. Motor behavior is a highly stable individual difference variable. Even among Whites, activity level measured during free play shows highly significant negative correlations with IQ: more restrained children average higher intellects.
Parallel results are found in four- to six-year-olds using teacher ratings. One study carried out in Quebec, Canada, had teachers rate immigrant children in French language preschools. The teachers reported more outgoing temperament among children of African descent than among those of European descent, and especially than in those of East Asian descent.
The racial differences in temperament are also found on standardized personality tests. Blacks consistently score more outgoing, active, socially dominant, and impulsive than do Whites, while Whites consistently score more active and socially dominant than do East Asians.
It may be surprising to learn that Blacks also have higher self-esteem than Whites and East Asians. This is true even when Blacks are poorer and less educated. In one large study of 11- to 16-year-olds, Blacks rated themselves as more attractive than did Whites. Blacks also rated themselves higher in reading, science and social studies (but not in mathematics).
The Blacks said this even though they knew they had lower actual academic achievement scores than White children.
In contrast, East Asian students, even though they score higher in academic achievement than Whites, often score lower in self-esteem.
What I am suggesting, then, is that Blacks have a self-assured, "bright," talkative personality, which leads many people to over-estimate their abstract reasoning ability. East Asians provide a "compare and contrast" case study, with people under-estimating their IQ because of their quietness and otherwise "subdued" personality profile. East Asians, who average higher than Whites on IQ tests (107 versus 100), have often been described to me as seeming "dull and uncreative" compared to Whites, achieving what they do only through unimaginative rote learning, imitation, and memorization.
The relative restraint of East Asians contrasted with the noisiness of Africans is apparent to anyone visiting their home continents. When the New York Yankees played the first game of the 2004 baseball season before a packed stadium in Tokyo, Japan, the announcers noted how very much quieter the crowd was than those at games in the U.S. But it was a more tranquil disposition, not a lack of interest in the game, which hushed the stands.
Because of the time difference, people all over Japan regularly get up at two in the morning to view games broadcast from the U.S. featuring American teams which include Japanese-born stars.
Like any other group, Whites look upon themselves as the norm. Whites tend not to speak up if they don’t know the answer to a question. Nor do they like to intrude on the privacy of others. They erroneously assume that, because Africans are talkative, they must know what they are talking about.
The flipside is the reticence and reserve of East Asians. In the realm of behavior, English traditionally uses the same term, "dumb," both for being unable to speak and for being stupid or silly (though both usages are quite Politically Incorrect these days). In the case of the average mental ability of East Asians, dumb is hardly dumber!
The converse is that the greater talkativeness of Blacks does not indicate brightness—it often masks a low ability to reason abstractly.
Thursday, July 12, 2007
A population of butterflies has evolved in a flash on a South Pacific island to fend off a deadly parasite.

The proportion of male Blue Moon butterflies dropped to a precarious 1 percent as the parasite targeted males. Then, within the span of a mere 10 generations, the males evolved an immunity that allowed their population share to soar to nearly 40 percent - all in less than a year.

"We usually think of natural selection as acting slowly, over hundreds or thousands of years," said study team member Gregory Hurst, an evolutionary geneticist at University College London. "But the example in this study happened in a blink of the eye, in terms of evolutionary time."

The scientists think the males developed genes that hold a male-killing microbial parasite, called Wolbachia, at bay.

The results, detailed in the July 13 issue of the journal Science, illustrate the power of positive natural selection on "suppressor" genes that thwart the lethal bacteria, allowing the male butterflies to bounce back.

Sylvain Charlat of the University of California, Berkeley, and University College London, along with colleagues, studied the sex ratios of Hypolimnas bolina butterflies on the Samoan islands of Upolu and Savaii, where males had dwindled to 1 percent of the populations in 2001.

The likely culprit was a male-killing parasite, Wolbachia, which lives inside the butterfly's reproductive cells, preferentially female sex cells. With a female host, Wolbachia can hitch a ride to the next generation aboard the mother's eggs. Since males are "useless" for the bacteria's survival, the parasite kills male embryos.

But the male butterflies found a way to stealthily overcome the parasites. At the beginning of 2006, the scientists found the males made up about 40 percent of Upolu's butterfly population.

On Savaii, females still dominated the Blue Moon butterfly population (99 percent) at the start of 2006, but by the year's end, males made up nearly 40 percent.

The team ran genetic analyses to see if the parasite had somehow vanished. It hadn't. Wolbachia was still present in butterflies from both islands. Other lab experiments indicated the males had evolved suppressor genes to shield against the parasite.

Unlike genetic tweaks that alter wing color or antennae length, mutations that affect a population's sex ratio can have a significant impact on the biology of a species, the scientists say.

"The suppressor gene allows infected females to produce males," Charlat said. "These males will mate with many, many females, and the suppressor gene will therefore be in more and more individuals over generations."

The study was funded by the U.S. National Science Foundation, the U.K. Natural Environment Research Council and the Natural Sciences and Engineering Research Council of Canada.
Wednesday, July 04, 2007
Men may still have more power in the workplace, but apparently women really are "the boss" at home. That's according to a new study by a team of Iowa State University researchers.
The study of 72 married couples from Iowa found that wives, on average, exhibit greater situational power -- in the form of domineering and dominant behaviors -- than their husbands during problem-solving discussions, regardless of who raised the topic. All of the couples in the sample were relatively happy in their marriages, with none in counseling at the time of the study.
Associate Professor of Psychology David Vogel and Assistant Professor of Human Development and Family Studies Megan Murphy led the research. The ISU research team also included Associate Professor of Human Development and Family Studies Ronald Werner-Wilson, Professor of Psychology Carolyn Cutrona -- who is director of the Institute for Social and Behavioral Research at Iowa State -- and Joann Seeman, a graduate student in psychology. They authored a paper titled "Sex Differences in the Use of Demand and Withdraw Behavior in Marriage: Examining the Social Structure Hypothesis," which appeared in last month's issue of the Journal of Counseling Psychology -- a professional journal published by the American Psychological Association.
Wives have the marriage power
"The study at least suggests that the marriage is a place where women can exert some power," said Vogel. "Whether or not it's because of changing societal roles, we don't know. But they are, at least, taking responsibility and power in these relationships. So at least for relatively satisfied couples, women are able to take some responsibility and are able to exert some power -- but it's hard for us to say why that's so."
"Women are responsible for overseeing the relationship -- making sure the relationship runs, that everything gets done, and that everybody's happy," said Murphy, "And so, maybe some of that came out in our findings in terms of women domineering and dominating -- that they were taking more responsibility for the relationship, regardless of whose topic was being discussed."
The researchers solicited participation from married couples in and around the Iowa State campus. On average, spouses were around 33 years of age and had been married for seven years. Most participants were European Americans (66%), followed by Asian (22%), Hispanic (5%), and African-American (4%) -- with the final three percent representing other nationalities.
Each spouse was asked to independently complete a questionnaire on relationship satisfaction and an assessment of overall decision-making ability in the relationship. Each spouse also was asked to identify a problem in their relationship -- an issue in which he or she desired the most change and which could not be resolved without the spouse's cooperation. Spouses were then asked to answer some questions about their chosen topics, including the type of problem-solving behaviors that generally take place when this topic arises, and the importance of the topic. Couples were then brought together and asked to discuss each of the problem topics for 10 minutes apiece -- discussions that were videotaped. The researchers did not participate in the discussion.
"We actually just asked them to start talking about the issue, and then we left the room," said Vogel. "And so they were all by themselves in the room talking. We were as non-obtrusive as possible. We just came back at the end of the period of time, and asked them to talk about the other topic."
At the end of the discussions, couples were separated again. Each spouse was then debriefed and discussed his or her feelings and reactions to the study.
The researchers reviewed and coded the videotapes of couples' interactions using a widely accepted interaction rating system. The system scores demand and withdraw behaviors along five dimensions: avoidance, discussion, blame, pressure for change, and withdrawal.
Not all talk and no action
The researchers concluded in their paper that wives behaviorally exhibited more domineering attempts and were more dominant -- i.e., more likely to have their partner give in -- than husbands during discussions of either spouse's topic. That contradicted their initial hypothesis that sex differences in marital power would favor husbands.
Vogel said that wives weren't simply talking more than their husbands in discussions, but actually were drawing favorable responses from their husbands to what they said.
"That's what I think was particularly interesting," he said. "It wasn't just that the women were bringing up issues that weren't being responded to, but that the men were actually going along with what they said. They (women) were communicating more powerful messages and men were responding to those messages by agreeing or giving in."
"There's been research that suggests that's a marker of a healthy marriage -- that men accept influence from their wives," said Murphy.
The study was funded, in part, by the National Institute of Mental Health, along with ISU. Vogel and Murphy hope to expand upon this research through a future study.
Source: Iowa State University
Thursday, June 28, 2007
June 26, 2007
By NICHOLAS WADE
Historians often assume that they need pay no attention to human evolution because the process ground to a halt in the distant past. That assumption is looking less and less secure in light of new findings based on decoding human DNA.
People have continued to evolve since leaving the ancestral homeland in northeastern Africa some 50,000 years ago, both through the random process known as genetic drift and through natural selection. The genome bears many fingerprints in places where natural selection has recently remolded the human clay, researchers have found, as people in the various continents adapted to new diseases, climates, diets and, perhaps, behavioral demands.
A striking feature of many of these changes is that they are local. The genes under selective pressure found in one continent-based population or race are mostly different from those that occur in the others. These genes so far make up a small fraction of all human genes.
A notable instance of recent natural selection is the emergence of lactose tolerance — the ability to digest lactose in adulthood — among the cattle-herding people of northern Europe some 5,000 years ago. Lactase, the enzyme that digests the principal sugar of milk, is usually switched off after weaning. But because of the great nutritional benefit for cattle herders of being able to digest lactose in adulthood, a genetic change that keeps the lactase gene switched on spread through the population.
Lactose tolerance is not confined to Europeans. Last year, Sarah Tishkoff of the University of Maryland and colleagues tested 43 ethnic groups in East Africa and found three separate mutations, all different from the European one, that keep the lactase gene switched on in adulthood. One of the mutations, found in peoples of Kenya and Tanzania, may have arisen as recently as 3,000 years ago.
That lactose tolerance has evolved independently four times is an instance of convergent evolution. Natural selection has used the different mutations available in European and East African populations to make each develop lactose tolerance. In Africa, those who carried the mutation were able to leave 10 times more progeny, creating a strong selective advantage.
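The arithmetic behind such a selective advantage can be sketched with a toy single-locus model; the fitness numbers below are illustrative assumptions, not estimates from the study:

```python
# Toy single-locus selection model (illustrative numbers, not the study's data).
# If carriers of a lactase-persistence allele leave w times more offspring,
# the allele frequency p updates each generation as p' = p*w / (p*w + (1-p)).

def next_freq(p: float, w: float) -> float:
    """One generation of selection with relative carrier fitness w."""
    return p * w / (p * w + (1.0 - p))

def generations_to_reach(p0: float, target: float, w: float) -> int:
    """Count generations until the allele frequency reaches the target."""
    p, gens = p0, 0
    while p < target:
        p = next_freq(p, w)
        gens += 1
    return gens

# Even a modest 10% fitness edge carries an allele from 1% to a majority
# within a few dozen generations -- an eyeblink on evolutionary timescales.
print(generations_to_reach(0.01, 0.5, 1.10))
# A tenfold advantage, as reported for the African mutation, is far faster.
print(generations_to_reach(0.01, 0.5, 10.0))
```

Because the odds p/(1-p) simply multiply by w each generation, strong selection spreads a variant in remarkably few generations, which is why a 3,000-year-old mutation can already be common.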
Researchers studying other single genes have found evidence for recent evolutionary change in the genes that mediate conditions like skin color, resistance to malaria and salt retention.
The most striking instances of recent human evolution have emerged from a new kind of study, one in which the genome is scanned for evidence of selective pressures by looking at a few hundred thousand specific sites where variation is common.
Last year Benjamin Voight, Jonathan Pritchard and colleagues at the University of Chicago searched for genes under natural selection in Africans, Europeans and East Asians. In each race, some 200 genes showed signals of selection, but without much overlap, suggesting that the populations on each continent were adapting to local challenges....
The findings suggest that Europeans and East Asians acquired their pale skin through different genetic routes and, in the case of Europeans, perhaps as recently as around 7,000 years ago....
Two years ago, Bruce Lahn, a geneticist at the University of Chicago, reported finding signatures of selection in two brain-related genes of a type known as microcephalins, so called because people born with mutated versions have very small brains. One of the two genes had come under selection in Europeans and the other in Chinese, Dr. Lahn reported.
He suggested that the selected forms of the genes had helped improve cognitive capacity and that many other genes, yet to be identified, would turn out to have done the same in these and other populations....
A genomic survey of world populations by Dr. Feldman, Noah Rosenberg and colleagues in 2002 showed that people clustered genetically on the basis of small differences in DNA into five groups that correspond to the five continent-based populations: Africans, Australian aborigines, East Asians, American Indians and Caucasians, a group that includes Europeans, Middle Easterners and people of the Indian subcontinent. The clusterings reflect “serial founder effects,” Dr. Feldman said, meaning that as people migrated around the world, each new population carried away just part of the genetic variation in the one it was derived from.
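A minimal simulation makes the "serial founder effect" idea concrete; the population sizes and allele counts below are invented for illustration. Each migration step samples founders from the previous population, so distinct variants can only be lost along the chain, never regained:

```python
# Minimal sketch of serial founder effects: each migrating group carries away
# only a sample of the parent population's variation, so genetic diversity
# (here, the count of distinct allele labels) shrinks with each hop.
import random

random.seed(42)

def founder_chain(source_alleles, n_hops, founders_per_hop):
    """Track distinct-allele counts along a chain of founder events."""
    pop = list(source_alleles)
    diversity = [len(set(pop))]
    for _ in range(n_hops):
        pop = random.sample(pop, founders_per_hop)  # a small founding group departs
        pop = pop * 20                              # the new colony re-expands
        diversity.append(len(set(pop)))
    return diversity

# 100 distinct alleles in the source population; 10 founders per migration.
print(founder_chain(range(100), n_hops=4, founders_per_hop=10))
```

Whatever the random draws, the diversity sequence is non-increasing: re-expansion copies existing alleles but cannot restore the ones the founders left behind, matching Dr. Feldman's description of each new population carrying away only part of the variation of its parent.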
The new scans for selection show so far that the populations on each continent have evolved independently in some ways as they responded to local climates, diseases and, perhaps, behavioral situations.
The concept of race as having a biological basis is controversial, and most geneticists are reluctant to describe it that way. But some say the genetic clustering into continent-based groups does correspond roughly to the popular conception of racial groups.
“There are difficulties in where you put boundaries on the globe, but we know now there are enough genetic differences between people from different parts of the world that you can classify people in groups that correspond to popular notions of race,” Dr. Pritchard said.
David Reich, a population geneticist at Harvard Medical School, said that the term “race” was scientifically inexact and that he preferred “ancestry.” Genetic tests of ancestry are now so precise, he said, that they can distinguish not just Europeans but northern from southern Europeans. Ancestry tests are used when trying to identify genes for disease risk by comparing patients with healthy people. Such studies exclude people of differing ancestries, because their genetic differences would obscure the difference between patients and unaffected people.
No one yet knows to what extent natural selection for local conditions may have forced the populations on each continent down different evolutionary tracks. But those tracks could turn out to be somewhat parallel. At least some of the evolutionary changes now emerging have clearly been convergent, meaning that natural selection has made use of the different mutations available in each population to accomplish the same adaptation.
This is the case with lactose tolerance in European and African peoples and with pale skin in East Asians and Europeans.
Thursday, June 21, 2007
(AP) -- Children at the top of the pecking order - either by birth or because their older siblings died - score higher on IQ tests than their younger brothers or sisters. The question of whether firstborn and only children are really smarter than those who come along later has been hotly debated for more than a century. Norwegian researchers now report that it isn't a matter of being born first, but growing up the senior child, that seems to result in the higher IQ scores.
Petter Kristensen and Tor Bjerkedal report their findings in Friday's issue of the journal Science.
It's a matter of what they call social rank in the family - the highest scores were racked up by the senior child - the first born or, if the first born had died in infancy, the next oldest.
Kristensen, of Norway's National Institute of Occupational Health, and Bjerkedal, of the Norwegian Armed Forces Medical Services, studied the IQ test results of 241,310 Norwegians drafted into the armed forces between 1967 and 1976. All were aged 18 or 19 at the time.
The average IQ of first-born men was 103.2, they found.
Second-born men averaged 101.2, but second-born men whose older sibling died in infancy scored 102.9.
And for third-borns, the average was 100.0. But if both older siblings died young, the third-born score rose to 102.6.
The findings provide "evidence that the relation between birth order and IQ score is dependent on the social rank in the family and not birth order as such," they concluded.
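The pattern the authors describe is visible just by grouping the reported averages by effective social rank rather than by birth order (numbers taken from the figures above):

```python
# Average IQ scores from the Norwegian conscript study, keyed by birth order
# and by the "social rank" actually occupied after any older-sibling deaths.
scores = {
    ("first-born",                   1): 103.2,
    ("second-born",                  2): 101.2,
    ("second-born, elder died",      1): 102.9,
    ("third-born",                   3): 100.0,
    ("third-born, both elders died", 1): 102.6,
}

# Everyone raised as the senior child clusters within 0.6 points, while the
# rank-2 and rank-3 averages fall progressively below them.
rank1 = [v for (_, rank), v in scores.items() if rank == 1]
assert max(rank1) - min(rank1) < 1.0
assert min(rank1) > scores[("second-born", 2)] > scores[("third-born", 3)]
```

Grouped this way, the second- and third-borns who grew up as the eldest land essentially on top of the true first-borns, which is the basis for the authors' conclusion that rearing position, not birth order itself, drives the gap.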
It's an issue that has perplexed experts since at least 1874, when Sir Francis Galton reported that men in prominent positions tend to be firstborns more often than would have been statistically expected.
Since then, several studies have confirmed higher intelligence scores for firstborns, while other analyses have questioned those findings and the methodology of those reports.
Frank J. Sulloway of the Institute for Personality and Social Research at the University of California, Berkeley, welcomed what he called the Norwegians' "elegantly designed" analysis.
"These two researchers demonstrate that how study participants were raised, not how they were born, is what actually influences their IQs," said Sulloway, who was not part of the research team.
The elder child pulls ahead, he said, perhaps as a result of learning gained through the process of tutoring younger brothers and sisters.
The older child benefits by having to organize and express its thoughts to tutor youngsters, he said, while the later born children may have no one to tutor.
© 2006 The Associated Press.
Friday, May 11, 2007
An international research team consisting of Patrik Lindenfors, Charles Nunn and Robert Barton examined data on primate brain structures in relation to traits important for male competition, such as greater body mass and larger canine teeth. The researchers also took into account the typical group size of each sex for individual primate species in order to assess sex-specific sociality - the tendency to associate with others and form social groups. The researchers then studied the differences between 21 primate species, which included chimpanzees, gorillas, and rhesus monkeys, using statistical techniques that incorporate evolutionary processes.
The authors found that sexual selection had an important influence on primates’ brains. Greater male-on-male competition (sexual selection) correlated with the relative size of several brain structures involved in autonomic functions, sensory-motor skills and aggression. Where sexual selection played a greater role, the septum was smaller, and therefore potentially exercised less control over aggression.
In contrast, the average number of females in a social group correlates with the relative size of the telencephalon (or cerebrum), the largest part of the brain. The telencephalon includes the neocortex, which is responsible for higher functions such as sensory perception, generation of motor commands and spatial reasoning.
Primates with the most sociable females evolved a larger neocortex, suggesting that female social skills may yield the biggest brains for the species as a whole. Social demands on females and competitive demands on males require skills handled by different brain components, the authors suggest. The contrasting brain types, a result of behavioural differences between the sexes, might be a factor in other branches of mammalian brain evolution beyond anthropoid primates, too.
Article: Primate Brain Architecture and Selection in Relation to Sex, Patrik Lindenfors, Charles L Nunn and Robert A Barton, BMC Biology
Source: BioMed Central
Saturday, March 03, 2007
Physical differences in the brain may increase the chances of a person choosing to take drugs, say Cambridge University scientists.
A study of rats showed variations in brain structure pre-dated their first exposure to narcotics, and made them more likely to opt for cocaine.
Writing in Science, the team say genes may affect these differences in humans.
Treatments to reduce their effect may be found - but a test of vulnerability to drugs is unlikely, they add.
Up to 500,000 people are currently addicted to Class A drugs such as cocaine, heroin and amphetamines, according to government figures.
One of the most important questions in the science of addiction surrounds the origin of differences noticed in the brains of human drug users.
While these differences are thought to be important in the way humans respond to drugs, it is difficult to prove whether they are a part of the natural brain chemistry of that individual, or have developed as a result of taking the drugs themselves.
To unravel this problem, the Cambridge researchers scanned the brains of rats, and found similar differences in 'neurotransmitter receptors' in certain parts of the brain.
Some of the animals had far fewer 'dopamine receptors' - the brain structures onto which drugs such as cocaine and heroin latch to produce their effect.
The scientists used a game in which the rats had to wait to press a button and receive a reward, coupled with detailed brain scans, to see if those with the fewest dopamine receptors were impulsive, a type of behaviour often linked with drug use in humans.
This was the case - even in rats which had no contact with drugs.
When the 'impulsive' rats were introduced to the drugs, and given the opportunity to take them, they were much more likely to do so than the rats with more dopamine receptors.
Dr Jeff Dalley, who led the study funded by the Medical Research Council and the Wellcome Trust, said that this showed clearly that the brain differences, and the impulsivity linked to them, pre-dated any exposure to drugs, with the possibility that the situation in human drug users could be the same.
"What we are talking about here is a possible physical trait producing vulnerability to drug use.
"The next step is identifying the gene or genes that cause this diminished supply of brain receptors.
"This may provide important new leads in the search for improved therapies for attention deficit/hyperactivity disorder (ADHD) and compulsive brain disorders such as drug addiction and pathological gambling."
But he said that the reasons for humans becoming addicted to any drug were more complex than simply their genetic make-up, and that a test for any gene uncovered by further research would not necessarily work.
"There are lots of reasons unconnected to genes why people use drugs, and I can't see that any test would be useful."
Lesley King-Lewis, Chief Executive of Action on Addiction, said: "It is well known that some personality traits are associated with a vulnerability to cocaine and other addiction problems.
"This study is extremely interesting because it has identified a biological basis in rats for some of the behaviours that we know are associated and shows how they can lead to drug addiction."
Dr Gerome Breen, from the Institute of Psychiatry in London, said that the differences found in the rats were very likely to have their equivalent in the human brain.
"This is a very exciting study which has successfully identified the biological basis of some of the behaviours that we know are associated with higher risk of cocaine and other addictions in humans.
"It also pinpoints a potential cause of relapse in abstinent drug users - what makes them start taking a drug again despite all the problems they know it will cause them.
"This means that we can start to investigate treatments that, at least partially, correct this deficit in the hope that these will prove successful in preventing relapse."
In recent years, biological science has proposed a new paradigm. The latest research shows that resilience can best be understood as an interplay between particular genes and environment — GxE, in the lingo of the field. Researchers are discovering that a particular variation of a gene can help promote resilience in the people who have it, acting as a buffer against the ruinous effects of adversity. In the absence of an adverse environment, however, the gene doesn't express itself in this way. It drops out of the psychological picture. "We now have well-replicated findings showing that genes play a major role in influencing people's responses to adverse environments," says Sir Michael Rutter, a leading British psychiatrist ...
Genes and stressed-out parents lead to shy kids
New research from the Child Development Laboratory at the University of Maryland shows that shyness in kids could relate to the manner in which a stress-related gene in children interacts with being raised by stressed-out parents.
In a study published in the February issue of Current Directions in Psychological Science, Nathan Fox, professor and director of the Child Development Laboratory, and his team found that kids who are consistently shy while growing up are particularly likely to be raised by stressed-out parents, and to possess a genetic variant associated with stress sensitivity.
This suggests that shyness relates to interactions between genes and the environment, as opposed to either genes or the environment acting alone. "Moms who report being stressed are likely to act differently toward their child than moms who report little stress," said Fox. "A mom under stress transfers that stress to the child. However, each child reacts to that stress somewhat differently. Our study found that genes play a role in this variability, such that those children who have a stress-sensitive variant of a serotonin-related gene are particularly likely to appear shy while growing up when they also are raised by mothers with high levels of stress.
"We don't understand how the environment directly affects the gene, but we know that the gene shows particularly strong relationships to behavior in certain environments."
Everyone carries two copies (alleles) of the serotonin-related gene examined in this study, and each allele can be long or short. The protein produced by the short form of the gene is known to predispose towards some forms of stress sensitivity.
Fox's research found that among children exposed to a mother's stress, it was only those who also inherited the short forms of the gene who showed consistently shy behavior.
"If you have two short alleles of this serotonin gene, but your mom is not stressed, you will be no more shy than your peers as a school age child," says Fox. "But we found that when stress enters the picture, the gene starts to show a strong relationship to the child's behavior. If you are raised in a stressful environment, and you inherit the short form of the gene, there is a higher likelihood that you will be fearful, anxious or depressed."
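The interaction Fox describes can be sketched as a toy gene-by-environment (GxE) model; the risk multiplier below is an invented illustration, not an estimate from the study:

```python
# Toy gene-by-environment (GxE) interaction, loosely modeled on the finding:
# the stress-sensitive (short/short) genotype predicts consistent shyness only
# when the rearing environment is stressful. Alone, neither factor does.

def shyness_risk(short_alleles: int, maternal_stress: bool) -> float:
    """Return a relative risk of consistent shyness (baseline = 1.0).

    short_alleles: number of short copies of the serotonin-related gene (0-2).
    The 2.5x multiplier is an arbitrary illustrative value.
    """
    if short_alleles == 2 and maternal_stress:
        return 2.5  # interaction term: only this combination elevates risk
    return 1.0      # gene alone, or stress alone, leaves risk at baseline

# Matches the quoted pattern: two short alleles with an unstressed mother
# confer no extra risk, but the same genotype plus stress does.
assert shyness_risk(2, maternal_stress=False) == 1.0
assert shyness_risk(0, maternal_stress=True) == 1.0
assert shyness_risk(2, maternal_stress=True) > 1.0
```

The point of the sketch is that neither main effect appears by itself; the elevated outcome exists only in the interaction cell, which is what distinguishes GxE findings from "genes or environment acting alone."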
Fox's group studies how genes and the family environment work together to shape the development of social competence in infants and young children. "We are particularly interested in very shy children. What keeps them shy and what may change them from being shy to not being shy anymore?
"We identify these children early in the first years of life, but it's not enough to identify a child with a certain disposition or gene. We want to understand how the environment works together with genes, what are the mechanisms that shape behavior."
Source: Association of Psychological Science
Saturday, January 20, 2007
Why is Respectable Opinion so sure that there isn't the slightest kernel of truth in Afrocentrist rantings about African Sun People and European Ice People? I'm not saying that Dr. Leonard Jeffries knows anything about biochemistry, but I am saying that there seems to be some sort of correlation between gloomy, cold weather and gloomy, cold personalities, just as there is between sunny, warm weather and sunny, warm personalities. And if the chemical at work is not melanin, it's worth finding out what it is.
Personally, I don't know whether being tanned keeps me happy (as "melanin science" would suggest), but getting tanned sure lifts my mood for at least a few hours. What is the biochemical mechanism behind this?
Further, there seems to be a very rough but real relationship between latitude and attitude, with hotter climes correlating with hotter moods. This is a consistent theme through most literature at least since Shakespeare, with his hot-blooded Italians and melancholy Danes.
I read an article about how Tromso, Norway, the northernmost small city in the world, has no higher rates of seasonal depression than elsewhere. But it appears from the article that Tromsonites have evolved a culture of self-therapy emphasizing near-mandatory conviviality during winter and bright artificial lights.
Further, self-selection is no doubt going on with people who can't stand the winters getting out of Tromso and others who don't mind them migrating in. If there is a genetic component to Seasonal Affective Disorder, this self-selection of darkness-likers will accumulate over the generations.
There's no doubt a big cultural component in this latitude-attitude correlation. For example, a culture is more likely to develop the charming tradition of shooting guns off in the air to celebrate (e.g., more Baghdadites were killed by falling bullets during peace celebrations at the end of the Iran-Iraq war than were killed by Iranian missile attacks during the eight-year war) if it's not 20 below outside. In places where it's too cold to go outside, a culture will emphasize developing the kind of self-restraint that keeps you from blasting holes in the ceiling. There is probably also a biological component, but it's not clear if it's hereditary or environmental. In other words, when Jimmy Buffett sings that changes in latitude mean changes in attitude, is he correct within an individual, or just across ethnic groups?