
French lesson



A burning car during the 2005 riots. (Wikicommons: Strologoff)

 

The gruesome attack on Charlie Hebdo has earned condemnation around the world. It has been called "cowardly" and "evil" by Barack Obama, "a barbaric act" by Stephen Harper, and an "infamy" by François Hollande.

Yes, violence is serious. It's a crime when done by an individual and war when done by a country. It's a grave breach of the rules that govern our society. Whatever differences we may have, they are to be settled peacefully, through the courts if need be. Violence is just not to be done.

Except it increasingly is. The attack on Charlie Hebdo is not an isolated incident. It's part of a worsening trend of violence by people described as jeunes [youths] or simply not described at all. That was not the case in the recent attack; the victims were too well known. But it is generally the case, and this conspiracy of silence has become something of a social norm, particularly in the media.

Yet statistics do exist, notably those compiled by the Gendarmerie. According to French criminologist Xavier Raufer:

The criminality we are talking about is the kind that is making life unbearable for the population: burglaries, thefts of all sorts, assaults, violent thefts without firearms, etc. In these specific cases, 7 out of 10 of these crimes are committed by people who in one way or another have an immigrant background, either directly (first generation on French territory, with or without a residence permit) or indirectly (second generation). (Chevrier and Raufer, 2014)

The word "immigrant" is misleading. Many if not most are French-born, and they tend to come much more from some immigrant groups than from others. In general, they are young men of North African or sub-Saharan African background, plus smaller numbers of Roma and Albanians. 

This criminality, when not being denied, is usually put down to social marginalization and lack of integration. Yet the reverse is closer to the truth. The typical French person is an individual in a sea of individuals, whereas immigrant communities enjoy strong social networks and a keen sense of solidarity. This is one of the reasons given for why the targets of the crime wave are so often Français de souche [old-stock French]. "Whites don't stick up for each other."


Personal violence in human societies

In France, as in other Western countries, personal violence is criminalized and even pathologized. The young violent male is said to be "sick." Or "deprived." He has not had a chance to get a good job and lead a nice quiet life.

Yet this is not how young violent males perceive themselves or, for that matter, how most human societies have perceived them down through the ages. Indeed, early societies accepted the legitimacy of personal violence. Each adult male had the right to defend himself and his kin with whatever violence he deemed necessary. The term "self-defence" is used loosely here—a man could react violently to a lack of respect or to slurs on his honor or the honor of his ancestors. There were courts to arbitrate this sort of dispute but they typically had no power, enforcement of court rulings being left to the aggrieved party and his male kin. In general, violence was a socially approved way to prove one’s manhood, attract potential mates, and gain respect from other men.

Things changed as human societies developed. The State grew in power and increasingly monopolized the legitimate use of violence, thus knocking down the violent young male from hero to zero. This course of action was zealously pursued in Northwest Europe from the 11th century onward (Carbasse, 2011, pp. 36-56). There were two reasons. First, the end of the Dark Ages brought a strengthening of State power, a resumption of trade and, hence, a growing need and ability by the authorities to pacify social relations. Second, the main obstacle to criminalization of personal violence—kin-based morality and the desire to avenge wrongs committed against kin—seems to have been weaker in Northwest Europe than elsewhere. There was correspondingly a greater susceptibility to more universal and less kin-based forms of morality, such as the Christian ban on murder in almost all circumstances. 

Murder was increasingly punished not only by the ultimate penalty but also by exemplary forms of execution, e.g., burning at the stake, drawing and quartering, and breaking on the wheel (Carbasse, 2011, pp. 52-53). This "war on murder" reached a peak from the 16th to 18th centuries when, out of every two hundred men, one or two would end up being executed (Taccoen, 1982, p. 52). A comparable number of murderers would die either at the scene of the crime or in prison while awaiting trial (Ireland, 1987).


Gene-culture co-evolution?

The cultural norm thus shifted toward nonviolence. There was now strong selection against people who could not or would not lead peaceful lives, their removal from society being abrupt, via the hangman's noose, or more gradual, through ostracism by one's peers and rejection on the marriage market. As a result, the homicide rate fell from between 20 and 40 homicides per 100,000 in the late Middle Ages to between 0.5 and 1 per 100,000 in the mid-20th century (Eisner, 2001, pp. 628-629).

Was this decline due solely to legal and cultural restraints on personal violence? Or were there also changes to the gene pool? Was there a process of gene-culture co-evolution whereby Church and State created a culture of nonviolence, which in turn favored some genotypes over others? We know that aggressive/antisocial behavior is moderately to highly heritable. In the latest twin study, heritability was 40% when the twins had different evaluators and 69% when they had the same one (Barker et al., 2009). The actual neural basis remains unclear. Perhaps a predisposition to violence is due to stronger impulsiveness and weaker internal controls on behavior (Niv et al., 2012). Perhaps the threshold for expression of violence is lower. Perhaps ideation comes easier (van der Dennen, 2006). Or perhaps the sight and smell of blood is more pleasurable (Vanden Bergh and Kelly, 1964).

It was probably a mix of cultural and genetic factors that caused the homicide rate to decline in Western societies. Even if culture alone were responsible, we would still be facing the same problem. Different societies view male violence differently:

In Algerian society for example, children are raised according to their sex. A boy usually receives an authoritarian and severe type of upbringing that will prepare him to become aware of the responsibilities that await him in adulthood, notably responsibility for his family and for the elderly. This is why a mother will allow her son to fight in the street and will scarcely be alarmed if the boy has a fall or if she sees a bruise. The boy of an Algerian family is accustomed from an early age to being hit hard without whimpering too much. People orient him more toward combat sports and group games in order to arm him with courage and endurance—virtues deemed to be manly. (Assous, 2005)

In Algeria and similar societies, a shaky equilibrium contains the worst excesses of male violence. Men think twice before acting violently, for fear of retaliation from the victim's brothers and other kinsmen. Of course, this "balance of terror" does not deter violence against those who have few kinsmen to count on.

Problems really begin, however, when a culture that legitimizes male violence coexists with one that delegitimizes it. This is France’s situation. Les jeunes perceive violence as a legitimate way to advance personal interests, and they eagerly pursue this goal with other young men. Conversely, les Français de souche perceive such violence as illegitimate and will not organize collectively for self-defence. The outcome is predictable. The first group will focus their attacks on members of the second group—not out of hate but because the latter are soft targets who cannot fight back or get support from others. 

But what about the obviously Islamist motives of the Charlie Hebdo attackers? Such motives can certainly channel violent tendencies, but those tendencies would exist regardless. Even if we completely eradicated radical Islam, les jeunes would still be present and still engaging in the same kind of behavior that is becoming almost routine. At best, there would be fewer high-profile attacks—the kind that make the police pull out all the stops to find and kill the perps. It is this "high end" that attracts the extremists, since they are the least deterred by the risks incurred. The “low end” tends to attract devotees of American hip hop. Keep in mind that less than two-thirds of France's Afro/Arab/Roma population is even nominally Muslim.


Conclusion

Modern France is founded on Western principles of equality, human betterment, and universal morality. Anyone anywhere can become French. That view, the official one, seems more and more disconnected from reality. Many people living in France have no wish to become French in any meaningful sense. By "French" I don't mean having a passport, paying taxes, or agreeing to a set of abstract propositions. I mean behaving in certain concrete ways and sharing a common culture and history.

This reality is sinking in, and with it a loss of faith in the official view of France. Faith can be restored, on the condition that outrageous incidents stop happening. But they will continue to happen. And they will matter a lot more than the much more numerous incidents tout court—the rising tide of thefts, assaults, and home invasions that are spreading deeper and deeper into areas that were safe a few years ago. The attack on Charlie Hebdo matters more because it cannot be hidden from public view and public acknowledgment. How does one explain the disappearance of an entire newspaper and the mass execution of its editorial board? 

The Front national will be the beneficiary, of course. It may already have one third of the electorate, but that's still not enough to take power, especially with all of the other parties from the right to the left combining to keep the FN out. Meanwhile, the Great Replacement proceeds apace, regardless of whether the government is "left-wing" or "right-wing."


References 

Assous, A. (2005). L'impact de l'éducation parentale sur le développement de l'enfant, Hawwa, 3(3), 354-369.
http://booksandjournals.brillonline.com/content/journals/10.1163/156920805774910033 

Barker, E.D., H. Larsson, E. Viding, B. Maughan, F. Rijsdijk, N. Fontaine, and R. Plomin. (2009). Common genetic but specific environmental influences for aggressive and deceitful behaviors in preadolescent males, Journal of Psychopathology and Behavioral Assessment, 31, 299-308.
http://www.researchgate.net/publication/226851959_Common_Genetic_but_Specific_Environmental_Influences_for_Aggressive_and_Deceitful_Behaviors_in_Preadolescent_Males/file/9fcfd506c1944288cb.pdf  

Chevrier, G. and X. Raufer. (2014). Aucun lien entre immigration et délinquance ? Une France peu généreuse avec ses immigrés ? Radiographie de quelques clichés "bien pensants" à la peau dure, Atlantico, November 26.
http://www.atlantico.fr/decryptage/aucun-lien-entre-immigration-et-delinquance-france-peu-genereuse-avec-immigres-radiographie-quelques-cliches-bien-pensants-peau-1875772.html  

Eisner, M. (2001). Modernization, self-control and lethal violence. The long-term dynamics of European homicide rates in theoretical perspective, British Journal of Criminology, 41, 618-638.
http://www.researchgate.net/publication/249284795_Modernization_Self-Control_and_Lethal_Violence._The_Long-term_Dynamics_of_European_Homicide_Rates_in_Theoretical_Perspective/file/60b7d52cbfa9aec78c.pdf

Ireland, R.W. (1987). Theory and practice within the medieval English prison, The American Journal of Legal History, 31, 56-67. 

Niv, S., C. Tuvblad, A. Raine, P. Wang, and L.A. Baker. (2012). Heritability and longitudinal stability of impulsivity in adolescence, Behavior Genetics, 42, 378-392.
http://europepmc.org/articles/PMC3351554

Taccoen, L. (1982). L'Occident est nu, Paris: Flammarion. 

Vanden Bergh, R.L., and J.F. Kelly. (1964). Vampirism. A review with new observations. Archives of General Psychiatry, 11, 543-547.
http://archpsyc.jamanetwork.com/article.aspx?articleid=488664  

Van der Dennen, J.M.G. (2006). Review essay: The murderer next door: Why the mind is designed to kill, Homicide Studies, 10, 320-335.
http://hsx.sagepub.com/content/10/4/320.short

Moving on ...

Dear readers,

I've decided to complete my move to The Unz Review (www.unz.com), so there will be no further blogging at this site. I'm doing this partly to reduce my workload of supervising two websites and partly to gain more control over my posts at TUR (commenting, correction of errors in the post, etc.). Thanks to Ron, my exposure on the Internet has greatly increased and my audience now includes a silent readership of mainstream journalists and, perhaps, newspaper editors.

Early last year, I had thoughts of closing up shop. I had the feeling of making the same points over and over again. I still have that feeling, but it no longer troubles me so much. For one thing, the same point can have many different applications in real life. For another, a lot of people have short memories, and it doesn't hurt to repeat a point I may have made two or three years ago.

Thank you for your loyalty! This isn't the end; it's just a move onward and upward.

Coming home

Photo by Shawn


Dear readers,

I have returned to my old website, after being expelled from The Unz Review. The immediate cause was my decision to close commenting on my last column. A catfight was developing between myself and Ron Unz in the comments, and I wanted to give the two of us time to cool off. I also needed time to write my next column, which would have replied in detail to two of his criticisms. Ron felt I was violating his freedom of speech and promptly blocked access to my author's account.

This may be all for the best. In the heat of anger, people say certain things they had previously kept quiet about. Ron was still resentful over my criticisms of his article "Race, IQ, and Wealth," which he had published some three years ago (Unz, 2012; Frost, 2012a, 2012b, 2012c). I thought his thinking had evolved since then, particularly with the latest findings by Piffer (2013, see also Frost, 2014). Apparently not. 

It's important not to lose sight of the big picture. The Unz Review is still doing good work by assisting writers like Steve Sailer and Razib Khan, and others may join them in the future. But it is very unlikely that I will return there.


Note 

I don't wish to be drawn into a tedious argument over "I say, he says." For what it's worth, I did not delete any comments that Ron made in reply to my criticisms of his 2012 article. He may be thinking of an email exchange between the two of us, which was partly reproduced in Frost (2012b). In the time I've known him, I've only deleted one of his comments, and that was the one he left after I had closed commenting on my last column.


References 

Frost, P. (2012a). Ron Unz on Race, IQ, and Wealth, Evo and Proud, July 21,
http://www.evoandproud.blogspot.ca/2012/07/ron-unz-on-race-iq-and-wealth.html 

Frost, P. (2012b). More on Race, IQ, and Wealth, Evo and Proud, July 28
http://www.evoandproud.blogspot.ca/2012/07/more-on-race-iq-and-wealth.html 

Frost, P. (2012c). He who pays the piper, Evo and Proud, August 18
http://www.evoandproud.blogspot.ca/2012/08/he-who-pays-piper.html 

Frost, P. (2014). Population differences in intellectual capacity: a new polygenic analysis, Evo and Proud, March 8
http://evoandproud.blogspot.ca/2014/03/population-differences-in-intellectual.html 

Piffer, D. (2013). Factor analysis of population allele frequencies as a simple, novel method of detecting signals of recent polygenic selection: The example of educational attainment and IQ, Interdisciplinary Bio Central, provisional manuscript
http://www.ibc7.org/article/journal_v.php?sid=312 

Unz, R. (2012). Race, IQ, and Wealth, The American Conservative, July 18.
http://www.theamericanconservative.com/articles/race-iq-and-wealth/

In the wrong place at the wrong time?


Dick Turpin was convicted of horse theft but had also been guilty of a string of robberies and murders (Wikicommons)


In each generation from 1500 to 1750, between 1 and 2% of all English men were executed either by court order or extra-judicially (at the scene of the crime or while in prison). This was the height of a moral crusade by Church and State to punish the wicked so that the good may live in peace.

Meanwhile, the homicide rate fell ten-fold. Were the two trends related? In a recent paper, Henry Harpending and I argued that a little over half of the homicide decline could be explained by the high execution rate, and its steady removal of violent males from the gene pool. The rest could be partly explained by Clark-Unz selection—violent males lost out reproductively because they were increasingly marginalized in society and on the marriage market. Finally, this decline was also due to a strengthening of controls on male violence: judicial punishment (policing, penitentiaries); quasi-judicial punishment (in schools, at church, and in workplaces); and stigmatization of personal violence in popular culture.
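For readers who want a feel for the arithmetic behind this kind of argument, here is a minimal sketch (in Python) of truncation selection on a normally distributed trait. It is only an illustration, not the model from the paper: the fraction removed per generation, the heritability, and the number of generations are assumed values, and reproduction before execution, Clark-Unz selection, and cultural effects are all ignored.

# Illustrative sketch: truncation selection on a normally distributed
# "propensity for violence" (measured in standard-deviation units).
# All parameter values below are assumptions chosen for illustration only.
from statistics import NormalDist

nd = NormalDist()                        # standard normal distribution
removed = 0.015                          # assume 1.5% of men removed each generation
h2 = 0.40                                # assume narrow-sense heritability of 0.40
generations = 10                         # roughly 1500-1750 at ~25 years per generation

z_cut = nd.inv_cdf(1 - removed)          # removal threshold in SD units
survivor_mean = -nd.pdf(z_cut) / (1 - removed)   # mean of the surviving (truncated) group
response = h2 * survivor_mean            # breeder's equation: R = h2 * S
print(f"shift per generation: {response:.3f} SD")
print(f"cumulative shift after {generations} generations: {generations * response:.2f} SD")

With these particular assumptions the cumulative change is a downward shift of roughly 0.15 SD; the point is only to show how a selection differential is computed from a removal rate, not to reproduce the paper's estimates.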

These controls drove the decline in the homicide rate, but they also tended over time to hardwire the new behavior pattern, by hindering the ability of violent males to survive and reproduce. The last half-century has seen a dramatic relaxation of these controls but only a modest rise in the homicide rate among young men of native English origin.

The above argument has been criticized on two grounds:

1. Executed offenders were not the worst of the worst. They were often people caught in the wrong place at the wrong time.

2. Executed offenders may have had children who survived to adulthood.

This week's column will address the first criticism. Did execution remove the most violent men? Or did it randomly remove individuals from, say, the most violent third?

Many genetic factors influence our propensity for personal violence: impulse control; violence ideation; pleasure from inflicting pain; etc. Regardless of how strong or weak these factors may be, the propensity itself should be normally distributed within the male population—it should follow a bell curve. If we move right or left from the population mean, the number of men should initially decline very little, with the result that over two-thirds of the men can be found within one standard deviation of the mean.

We really have to go one standard deviation to the right before the men begin to seem abnormally violent, but the remaining right-hand “tail” leaves us only 16% of the male population. What if we’re looking for a man who’s at least twice as violent as the normal two-thirds? He’s in the far right 1%. In a single gene pool, violent men stand out not just because they are noticeably abnormal but also because they are much less common.
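These tail figures can be checked directly, assuming (as this argument does) that the propensity really is normally distributed. A few lines of Python using only the standard library:

# Tail areas of a standard normal distribution, assuming the propensity for
# violence is normally distributed (values in SD units).
from statistics import NormalDist

nd = NormalDist()
within_1sd = nd.cdf(1) - nd.cdf(-1)      # share within 1 SD of the mean: ~68%
beyond_1sd = 1 - nd.cdf(1)               # right-hand tail beyond +1 SD: ~16%
beyond_2_33sd = 1 - nd.cdf(2.33)         # beyond +2.33 SD: ~1%
print(f"{within_1sd:.1%}, {beyond_1sd:.1%}, {beyond_2_33sd:.1%}")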

Identifying the most violent men. But how?

Were these men the ones that the English justice system executed between 1500 and 1750? Murder is violence taken to its logical extreme, yet most murder cases went unsolved in early modern England. The crime was difficult to prove for want of witnesses, either because none wished to come forward or because they had likewise been murdered. There were no police, no forensic laboratories, and much less of the investigative infrastructure that we have today. If you committed a one-time murder, your chances of not getting caught were good.

The criminal justice system in the eighteenth century [...] therefore operated on a rationale very different from that of a modern state, with its professional police forces, social services and a fully bureaucratised law-enforcement system. In the early eighteenth century at least, the enforcement of law and order depended largely on unpaid amateur officials, the justices of the peace and the parish constables and other local officers. (Sharpe, 2010, p. 92)

This is not to say that the justice system gave murder a lower priority. Rather, with the limited resources available, judges and juries engaged in "profiling." They scrutinized not only the offence but also the accused—his character and demeanor, his behavior during the crime and in the courtroom, and his previous offences. Juries could be lenient in cases of first-time offences for theft, but this leniency disappeared if the accused had a criminal history.

The justice system thus looked for signs that the accused had already committed worse crimes or would go on to do so. Ironically, our current system is the one that tends to catch people who were in the wrong place at the wrong time, i.e., inexperienced one-time murderers.

Hanged for robbery but guilty of murder

This may be seen in a book, published in London in 1735, that told the life stories of 198 executed criminals. Of the 198, only 34 (17%) had been sentenced to death for murder. A much larger number, 111 (56%), were charged with robbery, being described as highwaymen, footpads, robbers, and street robbers. Finally, another 37 (19%) were executed simply for theft (Hayward, 2013; see note). Robbery was punished more severely than simple theft because it threatened both life and property, especially if the victim failed to cooperate sufficiently or seemed to recognize the robber.

Robbery is the taking away violently and feloniously the goods or money from the person of a man, putting him in fear [...]. Yea, where there is a gang of several persons, only one of which robs, they are all guilty as to the circumstance of putting in fear, wherever a person attacks another with circumstances of terror [...] And in respect of punishment, though judgment of death cannot be given in any larceny whatsoever, unless the goods taken exceed twelve pence in value, yet in robbery such judgment is given, let the value of the goods be ever so small. (Hayward, 2013, p. 27)

Sooner or later, a robber ended up killing. We see this in the life story of Dick Turpin, who was hanged for horse theft, even though he had committed worse crimes:

The process of reconstruction may not tell us much about Turpin's personality, but it does give us the opportunity to put together a remarkable criminal biography, a tale of violent robberies, of murder, and, eventually, of the horse-thefts that led to his execution. (Sharpe, 2010, p. 8)

Allegations of murder came up in trials of robbers, but typically remained unproven because no witnesses could be produced. Nonetheless, the accused would sometimes confess to murder, either to clear his conscience or, in the wake of a death sentence, because he had nothing left to lose, like this man convicted for highway robbery: "This Reading had been concerned in abundance of robberies, and, as he himself owned, in some which were attended with murder" (Hayward, 2013, p. 91). A member of another gang, when caught, confessed to a long string of murders:

[...] he, without any equivocation, began to confess all the crimes of his life. He said that it was true they all of them deserved death, and he was content to suffer; he said, moreover, that in the course of his life he had murdered upwards of three-score with his own hands. He also carried the officers to an island in the river, which was the usual place of the execution of those innocents who fell into the hands of their gang [...] (Hayward, 2013, p. 1014)

In most cases, however, the accused would deny involvement in murders even after being condemned to death:

There has been great suspicions that he murdered the old husband to this woman, who was found dead in a barn or outhouse not far from Hornsey; but Wigley, though he confessed an unlawful correspondence with the woman, yet constantly averred his innocency of that fact, and always asserted that though the old man's death was sudden, yet it was natural. (Hayward, 2013, pp. 92-93)

At the place of execution he behaved with great composure and said that as he had heard he was accused in the world of having robbed and murdered a woman in Hyde Park, he judged it proper to discharge his conscience by declaring that he knew nothing of the murder, but said nothing as to the robbery. (Hayward, 2013, p. 96)

In the wrong place at the wrong time?

If we look at executed criminals, their profile is not that of unfortunates caught in the wrong place at the wrong time. Most were young men who had done their work in the company of likeminded young men. Those who operated alone were atypical, like this highwayman:

Though this malefactor had committed a multitude of robberies, yet he generally chose to go on such expeditions alone, having always great aversion for those confederacies in villainy which we call gangs, in which he always affirmed there was little safety, notwithstanding any oaths, by which they might bind themselves to secrecy. (Hayward, 2013, p. 93)

For most, long-term safety was a secondary concern. Their behavioral profile—fast life history, disregard for the future, desire to be with other young men and impress them with acts of bravado and violence—stood in contrast to the ascendant culture of early modern England. One example is this robber:

[...] when he returned to liberty he returned to his old practices. His companions were several young men of the same stamp with himself, who placed all their delight in the sensual and brutal pleasures of drinking, gaming, whoring and idling about, without betaking themselves to any business. Natt, who was a young fellow naturally sprightly and of good parts, from thence became very acceptable to these sort of people, and committed abundance of robberies in a very small space of time. The natural fire of his temper made him behave with great boldness on such occasions, and gave him no small reputation amongst the gang. [...] He particularly affected the company of Richard James, and with him robbed very much on the Oxford Road, whereon it was common for both these persons not only to take away the money from passengers, but also to treat them with great inhumanity [...] (Hayward, 2013, pp. 108-109)

This sort of description comes up repeatedly. Most condemned men struck observers as very atypical, and not merely among the worst third of society. In 1741, an observer described a hanging and the interactions between the condemned men and a crowd composed largely of their friends:

The criminals were five in number. I was much disappointed at the unconcern and carelessness that appeared in the faces of three of the unhappy wretches; the countenance of the other two were spread with that horror and despair which is not to be wondered at in men whose period of life is so near [...]

[...] the three thoughtless young men, who at first seemed not enough concerned, grew most shamefully wanton and daring, behaving themselves in a manner that would have been ridiculous in men in any circumstances whatever. They swore, laughed, and talked obscenely, and wished their wicked companions good luck with as much assurance as if their employment had been the most lawful.

At the place of execution the scene grew still more shocking, and the clergyman who attended was more the subject of ridicule than of their serious attention. The Psalm was sung amidst the curses and quarrelling of hundreds of the most abandoned and profligate of mankind, upon them (so stupid are they to any sense of decency) all the preparation of the unhappy wretches seems to serve only for subject of a barbarous kind of mirth, altogether inconsistent with humanity. And as soon as the poor creatures were half dead, I was much surprised to see the populace fall to hauling and pulling the carcasses with so much earnestness as to occasion several warm rencounters and broken heads. These, I was told, were the friends of the persons executed, or such as, for the sake of tumult, chose to appear so; as well as some persons sent by private surgeons to obtain bodies for dissection. The contests between these were fierce and bloody, and frightful to look at [...] The face of every one spoke a kind of mirth, as if the spectacle they beheld had afforded pleasure instead of pain, which I am wholly unable to account for. (Hayward, 2013, pp. 8-10)

The situation in early modern England was akin to a low-grade war, and it was not for nothing that its justice system seems to us so barbaric. The judges and juries were dealing with barbarians: gangs of young men who led a predatory lifestyle that made life miserable for people who ventured beyond the safety of their own homes.

Conclusion

We are still left with the original question: Were these criminals the most violent 1 to 2% or a random sample of a much larger proportion? In general, they behaved quite unlike most people, especially if they belonged to gangs, which seem to have been responsible for most homicides. It is hard to see how such people could correspond even to the most violent 16%—a range of individuals that begins one standard deviation to the right of the mean, at which point behavior just begins to seem "abnormal."

In all likelihood, execution removed individuals who were more than one standard deviation to the right of the mean, with a strong skew toward people more than two standard deviations to the right—in other words, something less than the most violent 16% with a strong skew toward the most violent 1%.

These assumptions differ from those of our model, which assumes that execution removed the most violent 1 to 2%. On the other hand, our model also assumes that each executed criminal would, in the absence of execution, have killed only one person over a normal lifetime. Clearly, many people among the executed were already serial murderers, not so much among the convicted murderers as among the convicted robbers. It is difficult to say whether the two sources of error would balance each other out, since we need more information on (1) just how abnormal the executed were in terms of behavior and (2) how many people they would have otherwise killed over a normal lifetime.

Executed criminals were probably a heterogeneous group. A quarter of them (mostly the thieves) would have likely killed 0 to 1 people on average if allowed to live out their lives. Another quarter may have averaged 1 to 2 murders. Finally, the remaining half may have had an even higher score. Within this last group, we can be sure that a hard core of individuals would have each gone on to kill dozens of people, if they had not already done so.

Note

The other executed criminals were identified as 8 housebreakers, 7 forgers, 4 pirates, 2 incendiaries, 1 threatening letter writer, 1 ravisher, 1 thief-taker, and 1 releaser of prisoners. Wherever a single individual was charged with more than one crime, I classified him or her under the most serious offence, i.e., murder took precedence over robbery, and robbery took precedence over theft.

Of the 198 executed criminals, 10 were women. The book actually tells the life stories of 201 criminals, but three of them were not executed. I excluded the life stories in the appendix (7 murderers and 4 thieves) because they came from a much earlier time period and may have been less representative.

References

Frost, P. and H. Harpending. (2015). Western Europe, state formation, and genetic pacification, Evolutionary Psychology, 13, 230-243. http://www.epjournal.net/articles/western-europe-state-formation-and-genetic-pacification/  


Hayward, A.L. (2013[1735]). Lives of the Most Remarkable Criminals - who Have Been Condemned and Executed for Murder, the Highway, Housebreaking, Street Robberies, Coining Or Other Offences, Routledge.


Sharpe, J. (2010). Dick Turpin: The Myth of the English Highwayman, Profile Books. 

How many were already fathers?


Hanging outside Newgate Prison (Wikicommons)


In England, executions peaked between 1500 and 1750 at 1 to 2% of all men of each generation. Were there genetic consequences? Were propensities for violence being removed from the gene pool? Did the English population become kinder and gentler? Such is the argument I made in a recent paper with Henry Harpending.

In this column, I will address a second criticism made against this argument: Many executed criminals already had children, so execution came too late in their lives to change the makeup of the next generation.

Reproductive success

Hayward (2013) provides a sample of 198 criminals who were executed in the early 1700s. Of this total, only 32 (16%) had children at the time of execution, and 12 of them had one child each. Their reproductive success breaks down as follows:

Family size — # executed criminals (out of 198)

1 child - 12
2 children - 3
3 children - 3
3-4 children - 1
5 children - 3
9 children - 1
"children" - 9

Although the above figures include illegitimate children, some executed criminals may have had offspring that they were unaware of or didn't wish to acknowledge. So we may be underestimating their reproductive success. But what were the chances of such children surviving to adulthood and reproducing? In pre-1840 England, 30% of all children were dead by the age of 15; in pre-1800 London, only 42% of all boys reached the age of 25 (Clark and Cummins, 2009). Chances of survival were undoubtedly even lower for children raised by single parents.
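As a rough back-of-the-envelope check, one can total the children in the table above and apply the child-survival figure just cited. The treatment of the ambiguous entries is an assumption: the "3-4 children" case is counted as 3.5 children, and each of the nine men listed only as having "children" is counted as having 2.

# Back-of-the-envelope estimate of surviving offspring per executed criminal.
# Counts follow the table above; the ambiguous entries are handled with
# assumed values (see comments), not figures from the source.
family_sizes = (
    [1] * 12      # 12 men with 1 child
    + [2] * 3     # 3 with 2 children
    + [3] * 3     # 3 with 3 children
    + [3.5]       # 1 with "3-4 children" (assumed midpoint)
    + [5] * 3     # 3 with 5 children
    + [9]         # 1 with 9 children
    + [2] * 9     # 9 listed only as having "children" (assumed 2 each)
)
executed = 198
child_survival = 0.70                    # ~30% of children dead by age 15
total_children = sum(family_sizes)
print(f"children per executed criminal: {total_children / executed:.2f}")
print(f"surviving to age 15 per executed criminal: {total_children * child_survival / executed:.2f}")

Even before adjusting for unacknowledged offspring, the average works out to well under one surviving child per executed man, which is the sense in which their reproductive success was low.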

Here and there, we find references to high infantile mortality among the progeny of executed criminals. The coiner John Johnson regretted "the heavy misfortune he had brought upon himself and family, two of his children dying during the time of his imprisonment, and his wife and third child coming upon the parish." Prospects seemed better for childless widows, as noted in the life story of the thief Robert Perkins: "He said he died with less reluctance because his ruin involved nobody but himself, he leaving no children behind him, and his wife being young enough to get a living honestly" (Hayward, 2013).

Reproductive success was also curbed by marital instability. The footpad Joseph Ward was married for all of two days:

The very next morning after their wedding, Madam prevailed on him to slip on an old coat and take a walk by the house which she had shown him for her uncle's. He was no sooner out of doors, but she gave the sign to some of her accomplices, who in a quarter of an hour's time helped her to strip the lodging not only of all which belonged to Ward, but of some things of value that belonged to the people of the house. (Hayward, 2013)

In these life stories, the word "wife" is often qualified: "lived as wife," "whom he called his wife," "who passed for his wife," "he at that time owned for his wife," etc. Overall, only 40% of the executed criminals had been married: 38% of the men and 80% of the women.

Age structure

The age composition of the executed criminals suggests another reason for their low reproductive success. More than half were put to death before the age of 30. Since the mean age of first marriage for English men at that time was 27 (Wikipedia, 2015b), it's likely that most of these criminals were still trying to amass enough resources to get married and start a family.

Ages  — # executed criminals (out of 198)

10-19 years - 18
20-29 years - 88
30-39 years - 41
40-49 years - 20
50-59 years - 6
60-69 years - 0
70+ years - 1

Many criminals may have planned to steal enough money to give up crime and lead a straight life. Such plans came to nought for the thief John Little:

[...] the money which they amass by such unrighteous dealings never thrives with them; that though they thieve continually, they are, notwithstanding that, always in want, pressed on every side with fears and dangers, and never at liberty from the uneasy apprehensions of having incurred the displeasure of God, as well as run themselves into the punishments inflicted by the law. To these general terrors there was added, to Little, the distracting fears of a discovery from the rash and impetuous tempers of his associates, who were continually defrauding one another in their shares of the booty, and then quarrelling, fighting, threatening, and what not, till Little sometimes at the expense of his own allotment, reconciled and put them in humour. (Hayward, 2013)

Nonetheless, it is possible that others would have saved up a "nest egg," started a family, and moved on to a respectable life. Dick Turpin, for instance, was able to abandon highway robbery and pose as a horse trader. His ruse ultimately failed because he continued to run afoul of the law (Wikipedia, 2015a). The extent of this life strategy is difficult to measure because the existing information almost wholly concerns those criminals who were caught and executed.

Conclusion

Clearly, some of the executed criminals had already reproduced, but the overall reproductive success was very low, and probably lower still if we adjust for infantile mortality. Instead of arguing that executions had little impact on the gene pool because too many of the executed had already reproduced, one could argue the opposite: the genetic impact was inconsequential because so few would have reproduced anyway, even if allowed to live out their lives.

Reproductive success was highly variable in the criminal underclass. Many would have had few children with or without being sent to the gallows. But some would have done much better. At the age of 26, the highwayman William Miller already had two children by two wives, and many other women gravitated around him, even as he prepared for death: "Yet in the midst of these tokens of penitence and contrition several women came still about him." At the age of 25, the murderer Captain Stanley had fathered three or four children by one woman and was looking for a new wife. One might also wonder about some of the executed teenagers. At the age of 19, the footpad Richard Whittingham was already married, though still childless, and the thief William Bourne likewise at the age of 18.

In an earlier England, such young men would have done well reproductively, as leaders of warrior bands. But that England no longer existed, and criminal gangs offered the only outlet for engaging in plunder, violence, and debauchery with other young men.

References

Clark, G. and N. Cummins. (2009). Disease and Development: Historical and Contemporary Perspectives. Urbanization, Mortality, and Fertility in Malthusian England, American Economic Review: Papers & Proceedings, 99(2), 242-247.
http://neilcummins.com/Papers/AER_2009.pdf

Frost, P. and H. Harpending. (2015). Western Europe, state formation, and genetic pacification, Evolutionary Psychology, 13, 230-243. http://www.epjournal.net/articles/western-europe-state-formation-and-genetic-pacification/  

Hayward, A.L. (2013[1735]). Lives of the Most Remarkable Criminals - who Have Been Condemned and Executed for Murder, the Highway, Housebreaking, Street Robberies, Coining Or Other Offences, Routledge.
http://www.gutenberg.org/files/13097/13097-h/13097-h.htm

Wikipedia. (2015a). Dick Turpin
http://en.wikipedia.org/wiki/Dick_Turpin

Wikipedia. (2015b). Western European Marriage Pattern
http://en.wikipedia.org/wiki/Western_European_marriage_pattern

The hidden past of Claude Lévi-Strauss




Claude Lévi-Strauss, 1973 (Wikicommons)


The anthropologist Claude Lévi-Strauss died six years ago, leaving behind a treasure trove of correspondence and unpublished writings. We can now trace where his ideas came from and how they evolved.

I admired Lévi-Strauss during my time as an anthropology student because he asked questions that Marxist anthropologists would never ask. That's why I preferred to call myself a Marxisant, and not a full-blown Marxist. I especially admired him for addressing the issue of nature versus nurture, which had once been a leading issue in anthropology but was now studiously ignored. Only he, it seemed, could defy this omertà and not suffer any ill effects, perhaps because of his age and status.

In his best-known tome, The Elementary Structures of Kinship, this issue dominated the first chapter:

Man is both a biological being and a social individual. Among his responses to external or internal stimuli, some are wholly dependent upon his nature, others upon his social environment.

Lévi-Strauss admitted that the two were not always easy to separate:

Culture is not merely juxtaposed to [biological] life nor superimposed upon it, but in one way serves as a substitute for life, and in the other, uses and transforms it, to bring about the synthesis of a new order.
 
He reviewed the different ways of disentangling one from the other:

The simplest method would be to isolate a new-born child and to observe its reactions to various stimuli during the first hours or days after birth. Responses made under such conditions could then be supposed to be of a psycho-biological origin, and to be independent of ulterior cultural syntheses.

[Nonetheless,] the question always remains open whether a certain reaction is absent because its origin is cultural, or because, with the earliness of the observation, the physiological mechanisms governing its appearances are not yet developed. Because a very young child does not walk, it cannot be concluded that training is necessary, since it is known that a child spontaneously begins to walk as soon as it is organically capable of doing so.

His interest in the interactions between culture and biology went further. The gene pool of a population will influence its culture, which in turn will alter the gene pool:

The selection pressure of culture—the fact that it favors certain types of individuals rather than others through its forms of organization, its ideas of morality, and its aesthetic values—can do infinitely more to alter a gene pool than the gene pool can do to shape a culture, all the more so because a culture's rate of change can certainly be much faster than the phenomena of genetic drift. (Lévi-Strauss, 1985, pp. 24-25)

This is of course gene-culture co-evolution. He may have given the idea to L.L. Cavalli-Sforza, who first began to propound it while teaching a cultural evolution class in 1978-1979. Two of his students, Robert Boyd and Peter Richerson, went on to popularize the idea in their book Culture and the Evolutionary Process (1985) (Stone and Lurquin 2005, p. 108). Lévi-Strauss had in fact mentioned the same idea long before in a UNESCO lecture:

When cultures specialize, they consolidate and favor other traits, like resistance to cold or heat for societies that have willingly or unwillingly had to adapt to extreme climates, like dispositions to aggressiveness or contemplation, like technical ingenuity, and so on. In the form these traits appear to us on the cultural level, none can be clearly linked to a genetic basis, but we cannot exclude that they are sometimes linked partially and distantly via intermediate linkages. In this case, it would be true to say that each culture selects for genetic aptitudes that, via a feedback loop, influence the culture that had initially helped to strengthen them. (Lévi-Strauss, 1971)

In the same lecture, he made another point:

[Humanity] will have to relearn that all true creation implies some deafness to the call of other values, which may go so far as to reject or even negate them. One cannot at the same time melt away in the enjoyment of the Other, identify oneself with the Other, and keep oneself different. If fully successful, complete communication with the Other will doom its creative originality and my own in more or less short time. The great creative ages were those when communication had increased to the point that distant partners stimulated each other but not so often and rapidly that the indispensable obstacles between individuals, and likewise between groups, dwindled to the point that excessively easy exchanges would equalize and blend away their diversity. (Lévi-Strauss, 1971)

His audience was taken aback, according to fellow anthropologist Wiktor Stoczkowski:

These words shocked the listeners. One can easily imagine how disconcerted UNESCO employees were, who, meeting Lévi-Strauss in the corridor after the lecture, expressed their disappointment at hearing the institutional articles of faith to which they thought they had the merit of adhering called into question. René Maheu, the Director General of UNESCO, who had invited Lévi-Strauss to give this lecture, seemed upset. (Stoczkowski, 2008; Frost, 2014)

Where his ideas came from

Since his death in 2009, we have gained a clearer picture of his intellectual evolution. His published writings had already provided an answer:

When I was about sixteen, I was introduced to Marxism by a young Belgian socialist, whom I had got to know on holiday, and who is now one of his country's ambassadors abroad. I was all the more delighted by Marx in that the reading of the works of the great thinker brought me into contact for the first time with the line of philosophical development running from Kant to Hegel; a whole new world was opened up to me. Since then, my admiration for Marx has remained constant [...] (Lévi-Strauss, 2012 [1973])

Looking through Lévi-Strauss' published and unpublished writings, Wiktor Stoczkowski tried to learn more about this episode but found nothing:

It suffices however to look closely at the milieus that Lévi-Strauss frequented in the 1920s and 1930s, or to reread the articles he published during that period to realize that his references to Marx were at that time astonishingly rare, in flagrant contradiction with his declarations […] In contrast, another name often came up during that time in the writings of the young Lévi-Strauss: that of Henri De Man. And that name, curiously, Lévi-Strauss would never mention after the war. (Stoczkowski, 2013)

As a young leftist disenchanted with Marxism, Lévi-Strauss was especially fascinated by De Man's book Au-delà du marxisme (Beyond Marxism), published in 1927. One of his friends invited De Man to Paris to present his ideas to French socialists. Lévi-Strauss was given the job of organizing the lecture and wrote to De Man about the difficulties encountered:

We have run into many difficulties, which we scarcely suspected and which have sadly shed light on the conservative and sectarian spirit of a good part of French socialism [...]. We thought that the best means to give this [lecture] all of the desirable magnitude would be to make it public [...] [but] to obtain the key support of the Socialist Students, we have agreed to make your lecture non-public, and to reserve admission to members of socialist organizations. Thus, we have learned that Marxism is a sacrosanct doctrine in our party, and that to study theories that stray from it, we have to shut ourselves in very strongly, so that no one on the outside will know (Stoczkowski, 2013)

The lecture was held the next year. Stoczkowski describes the letter that Lévi-Strauss wrote to the invitee afterwards:

"Thanks to you," he wrote, "socialist doctrines have finally emerged from their long sleep; the Party is undergoing, thanks to you, a revival of intellectual activity ...." But there is more. Speaking on his behalf and on behalf of his young comrades, Lévi-Strauss informed De Man that his book Au-delà du marxisme had been for them "a genuine revelation..." Speaking personally, Lévi-Strauss added that he was "profoundly grateful" to De Man's teachings for having "helped me get out of an impasse I believed to have no way out."(Stoczkowski, 2013)

Nothing indicates that Lévi-Strauss had ever been a Marxist in his youth. Both he and his friends saw it as a pseudo-religion that stunted the development of socialism.

But who was Henri De Man?

He was a Belgian Marxist who had lived in Leipzig, Germany, where he became the editor of a radical socialist journal, Leipziger Volkszeitung, that ran contributions by Rosa Luxemburg, Pannekoek, Radek, Trotsky, Karl Liebknecht, and others. In 1907, he helped found the Socialist Youth International. He later returned to Belgium and enlisted when war broke out, seeing the Allied side as a progressive alternative to German authoritarianism.

His views changed during the 1920s, while teaching at the University of Frankfurt. He came to feel that Marxists erred in seeing themselves as an antithesis to the current system; such a perspective made them oppose all traditional values, particularly Christianity and national identity. He now argued that laws, morality, and religion are not bourgeois prejudices, but rather things that are necessary to make any society work. Marxists also erred, he felt, in their narrow focus on economic determinism and their disregard for psychology and the will to act. Although De Man acknowledged the self-destructive tendencies of capitalism, he argued that these tendencies do not inevitably lead to revolution. Rather, revolution will happen only when enough people realize that current conditions are neither tolerable nor inevitable. Above all, revolution cannot happen unless it respects existing cultural, religious, and national traditions:

If one sees in socialism something other than and more than an antithesis to modern capitalism, and if one relates it to its moral and intellectual roots, one will find that these roots are the same as those of our entire Western civilization. Christianity, democracy, and socialism are now, even historically, merely three forms of one idea.

De Man returned to Belgium during the 1930s, becoming vice-president and then president of the Belgian Labour Party. In 1935, with the formation of a government of national unity to fight the Great Depression, he was made minister of public works and job creation. In this role, he pushed for State planning and looked to Germany and Italy as examples to be followed. He became increasingly disillusioned with parliamentary democracy and began to call for an "authoritarian democracy" in which decisions would be made primarily through the executive and referendums, rather than through parliament and party politics (Tremblay, 2006).

When Germany overran Belgium in 1940, De Man issued a manifesto to Labour Party members and advised them to collaborate: "For the working classes and for socialism, this collapse of a decrepit world, far from being a disaster, is a deliverance" (Wikipedia, 2015). Over the next year, he served as de facto prime minister before falling into disfavor with the German authorities. He spent the rest of the war in Paris and then fled to Switzerland where he lived his final years. Meanwhile, a Belgian court convicted him in absentia of treason.

Conclusion

Like many people after the war, Claude Lévi-Strauss had to invent a new past. It didn't matter that he had admired Henri de Man at a time when the Belgian socialist was not yet a fascist or a collaborator. As Stoczkowski notes, guilt by association would have been enough to ruin his academic career. Ironically, if he had really been a loyal Marxist during the late 1920s and early 1930s, he would also have denied back then the crimes being committed in the name of Marxism: the Ukrainian famine, Stalin's purges ... Yet, for that, he never faced any criticism.

References

De Man, H. (1927). Au-delà du marxisme, Brussels: L'Églantine.

Frost, P. (2014). Negotiating the gap. Four academics and the dilemma of human biodiversity, Open Behavioral Genetics, June 20.
http://openpsych.net/OBG/2014/06/negotiating-the-gap/ 

Lévi-Strauss, C. (1969 [1949]). The Elementary Structures of Kinship, Beacon Press.

Lévi-Strauss, C. (1971). Race et culture, conférence de Lévi-Strauss à L'UNESCO le 22 mars 1971

Lévi-Strauss, C. (2012[1973]). Tristes Tropiques, New York: Penguin

Lévi-Strauss, C. (1985). Claude Lévi-Strauss à l'université Laval, Québec (septembre 1979), prepared by Yvan Simonis, Documents de recherche no. 4, Laboratoire de recherches anthropologiques, Département d'anthropologie, Faculté des Sciences sociales, Université Laval.

Stoczkowski, W. (2008). Claude Lévi-Strauss and UNESCO, The UNESCO Courier, no. 5, pp. 5-8.

Stoczkowski, W. (2013). Un étrange socialisme de Claude Lévi-Strauss / A weird socialism of Claude Lévi-Strauss, Europe, 91, n° 1005-1006, 37-53.

Stone, L. and P.F. Lurquin. (2005). A Genetic and Cultural Odyssey. The Life and Work of L. Luca Cavalli-Sforza. New York: Columbia University Press.

Tremblay, J-M. (2006). Henri de Man, 1885-1953, Les classiques des sciences sociales, UQAC

Wikipedia. (2015). Henri de Man


More on the younger Franz Boas


As a professor at Columbia, Franz Boas encountered the elite liberal culture of the American Northeast, one example being Mary White Ovington, a founder of the NAACP (Wikicommons)


Antiracism has roots that go back to early Christianity and the assimilationist Roman and Hellenistic empires. In its modern form, however, it is a much more recent development, particularly in its special focus on relations between whites and blacks and its emphasis on discrimination as the cause of any mental or behavioral differences.

Modern antiracism began in the early 1800s as a radical outgrowth of abolitionism, reaching high levels of popular support in the mid-1800s, particularly in the American Northeast, and then falling into decline due to growing interest in Social Darwinism and increasing disillusionment with the aftermath of the Civil War. By the 1920s, it really held sway only in the Northeast, and even there it was losing ground.

This situation changed dramatically in the 1930s. Antiracism revived and entered a period of growth that would eventually go global. The anthropologist Franz Boas played a key role through his own work and indirectly through the work of his two protégés: Margaret Mead and Ruth Benedict.

Yet this was the old Boas, a man already in his seventies. The younger Boas had thought differently, as seen in an 1894 speech he gave on "Human Faculty as Determined by Race":

We find that the face of the negro as compared to the skull is larger than that of the American [Indian], whose face is in turn larger than that of the white. The lower portion of the face assumes larger dimensions. The alveolar arch is pushed forward and thus gains an appearance which reminds us of the higher apes. There is no denying that this feature is a most constant character of the black races and that it represents a type slightly nearer the animal than the European type. [...] We find here at least a few indications which tend to show that the white race differs more from the higher apes than the negro. But does this anatomical difference prove that their mental capacity is lower than that of the white? The probability that this may be the case is suggested by the anatomical facts, but they by themselves are no proof that such is the case. (Boas, 1974, p. 230)

It does not seem probable that the minds of races which show variations in their anatomical structure should act in exactly the same manner. Differences of structure must be accompanied by differences of function, physiological as well as psychological; and, as we found clear evidence of difference in structure between the races, so we must anticipate that differences in mental characteristics will be found. (Boas, 1974, p. 239)

We have shown that the anatomical evidence is such, that we may expect to find the races not equally gifted. While we have no right to consider one more ape-like than the other, the differences are such that some have probably greater mental vigor than others. The variations are, however, such that we may expect many individuals of all races to be equally gifted, while the number of men and women of higher ability will differ. (Boas, 1974, p. 242)

Boas returned to this topic in a 1908 speech on "Race Problems in America":

I do not believe that the negro is, in his physical and mental make-up, the same as the European. The anatomical differences are so great that corresponding mental differences are plausible. There may exist differences in character and in the direction of specific aptitudes. There is, however, no proof whatever that these differences signify any appreciable degree of inferiority of the negro, notwithstanding the slightly inferior size, and perhaps lesser complexity of structure, of his brain; for these racial differences are much less than the range of variation found in either race considered by itself. (Boas, 1974, pp. 328-329)

How did his views on race evolve over the next twenty years? This evolution is described by Williams (1996), who sees his views beginning to change at the turn of the century. After getting tenure at Columbia University in 1899, he became immersed in the elite liberal culture of the American northeast and began to express his views on race accordingly. The onset of this change is visible in 1905, when he penned an article for the first issue of The Crisis, the organ of the NAACP: “The Negro and the Demands of Modern Life.” While pointing out that the average negro brain was "smaller than that of other races" and that it was "plausible that certain differences of form of brain exist," he cautioned:

We must remember that individually the correlation [...] is often overshadowed by other causes, and that we find a considerable number of great men with slight brain weight. [...] We may, therefore, expect less average ability and also, on account of probable anatomical differences, somewhat different mental tendencies. (Williams, 1996, p. 17)

The same year, he wrote to a colleague, stressing "the desirability of collecting more definite information in relation to certain traits of the Negro race that seem of fundamental importance in determining the policy to be pursued towards that race" (Williams, 1996, p. 18). In 1906, he sought funding for such a project with two specific goals:

(1) Is there an earlier arrest of mental and physical development in the Negro child, as compared with the white child? And, if so, is this arrest due to social causes or to anatomical and physiological conditions?

(2) What is the position of the mulatto child and of the adult mulatto in relation to the two races? Is he an intermediate type, or is there a tendency of reversion towards either race? So that particularly gifted mulattoes have to be considered as reversals of the white race. The question of the physical vigor of the mulatto could be taken up at the same time. (Williams, 1996, p. 19)

His tone was less even-handed in a private letter, written the same year:

You may be aware that in my opinion the assumption seems justifiable that on the average the mental capacity of the negro may be a little less than that of the white, but that the capacities of the bulk of both races are on the same level. (Williams, 1996, p. 19)

In 1911, Boas published the first edition of The Mind of Primitive Man. It recycled most of his previous writings on race, while emphasizing that race differences in mental makeup were statistical and showed considerable overlap. In 1915, he continued in this direction when he wrote a preface to Half A Man by Mary White Ovington, one of the founders of the NAACP:

Many students of anthropology recognize that no proof can be given of any material inferiority of the Negro race; that without doubt the bulk of the individuals composing the race are equal in mental aptitude to the bulk of our own people; that, although their hereditary aptitude may lie in slightly different directions, it is very improbable that the majority of individuals composing the white race should possess greater ability than the Negro race. (Williams, 1996, pp. 22-23)

Nonetheless, one finds little change from his earlier writings in his 1928 work Anthropology and Modern Life:

[...] the distribution of individuals and of family lines in the various races differs. When we select among the Europeans a group with large brains, their frequency will be relatively high, while among the Negroes the frequency of occurrence of the corresponding group will be low. If, for instance, there are 50 percent of a European population who have a brain weight of more than, let us say 1,500 grams, there may be only 20 percent of Negroes of the same class. Therefore, 30 percent of the large-brained Europeans cannot be matched by any corresponding group of Negroes. (Williams, 1996, p. 35)

Conclusion

From 1900 to 1930, Boas seemed to become increasingly liberal in his views on race, but this trend was hesitant at best and reflected, at least in part, a change in the audience he was addressing. As a professor at Columbia, he was dealing with a regional WASP culture that still preserved the radical abolitionism of the previous century. A good example was Mary White Ovington, whose Unitarian parents had been involved in the anti-slavery movement and who in 1910 helped found the NAACP. Boas was also in contact with the city's growing African American community and, through Ovington's contacts, wrote articles for the NAACP. Finally, he was addressing the growing Jewish community, which identified with antiracism partly out of self-interest and partly out of a desire to assimilate into northeastern WASP culture.

Boas didn't really change his mind on race until the 1930s. The cause is not hard to pinpoint. When he died in 1942, an obituary mentioned his alarm over the threat of Nazism:

Dr. Boas, who had studied and written widely in all fields of anthropology devoted most of his researches during the past few years to the study of the "race question," especially so after the rise of the Nazis in Germany. Discussing his efforts to disprove what he called "this Nordic nonsense," Prof. Boas said upon his retirement from teaching in 1936 that "with the present condition of the world, I consider the race question a most important one. I will try to clean up some of the nonsense that is being spread about race those days. I think the question is particularly important for this country, too; as here also people are going crazy." (JTA, 1942)

Hitler's rise to power created a sense of urgency among many academics, both Jewish and non-Jewish, thereby convincing fence-sitters like Franz Boas to put aside their doubts and take a more aggressive stand on race. Thus began the war on racism, which foreshadowed the coming world conflict.

References

Boas, F. (1974). A Franz Boas Reader. The Shaping of American Anthropology, 1883-1911, G.W. Stocking Jr. (ed.), Chicago: The University of Chicago Press.

Frost, P. (2014). The Franz Boas you never knew, Evo and Proud, July 13  http://evoandproud.blogspot.ca/2014/07/the-franz-boas-you-never-knew.html

JTA (1942). Dr. Franz Boas, Debunker of Nazi Racial Theories, Dies in New York, December 23  http://www.jta.org/1942/12/23/archive/dr-franz-boas-debunker-of-nazi-racial-theories-dies-in-new-york


Williams Jr., V.J. (1996). Rethinking Race: Franz Boas and His Contemporaries, University Press of Kentucky.  https://books.google.ca/books?id=MKnIOfHNxXMC&printsec=frontcover&dq=Rethinking+race+franz+boas&hl=fr&sa=X&ei=lTkcVcLqLs-OyATM-IGoCQ&ved=0CCAQ6AEwAA#v=onepage&q=Rethinking%20race%20franz%20boas&f=false

Impressions of Russia


 
The Battle for Sevastopol, now showing in Russian theatres
 
The young man shook his head. “No, I can’t say I’m pro-Putin. There’s too much corruption in Russia, with too much money going to the wrong people. We should become more Western. Instead, we’re moving in the other direction.”

Finally, I thought, a liberal critic of Putin. The young man continued. “Here it’s not too bad, but in Moscow you can see the change. They’re all over. Please, don’t get me wrong, I don’t hate anyone, but I feel uncomfortable when there are so many of them. Sometimes, I wonder whether I’m still in Russia.”

*******************************************************

Much had changed since my last visit ten years ago. Driving into the city of Voronezh from the airport, I could see entirely new neighborhoods, supermarkets, office buildings, and the like. In 2003, there was only one shopping mall in the whole city, and it was nothing special. Now, there were malls as huge as any in Toronto. Things had likewise improved for some of our old friends and acquaintances. A few had moved up into the growing middle class, including one couple who showed us their new palatial home on the outskirts.

Yet the bulk of the population seemed no better off, and in some ways worse off. Ten years ago, jobs were there for the taking. The pay may have been lousy, but it was money. Now, the competition is intense even for those jobs. An unemployed man told me: “It’s hard to find work now. Employers will hire immigrants because they work for much less and won’t complain. And there are a lot of them now, mainly from Central Asia, but also from places all over.”

Sour grapes? Perhaps. But it’s consistent with what a Quebec building contractor had told me earlier. “I no longer bother with Russian construction projects because there’s always a Russian company that will put in an absurdly low bid. The only way they can stay within budget is by hiring illegal immigrants. Everyone knows it, but nothing is ever done to stop it.”

********************************************************

I wasn’t surprised to see Ukrainian refugees in a big city like Voronezh, but it was surprising to see so many in remote farming villages. And each refugee family had a horror story to tell. It’s one thing to hear these stories from professional journalists; it’s another to hear them from ordinary people who aren’t being paid to say what they say. This is an underappreciated factor in the growing anger among Russians against the Ukrainian government.

After all that’s happened, I don’t see how eastern Ukraine will ever accept being ruled by Kiev. It’s like a marriage that has crossed the line between verbal abuse and physical violence.

*********************************************************

We were standing outside a fast food kiosk. “I just don’t get it,” said my wife. “Prices are almost as high here as in Canada, yet the wages are a lot lower. How do people manage to survive?”

A young man overheard her. “The people who don’t survive are the ones you don’t get to see.”

**********************************************************

Postwar housing projects cover most of the city. They are now aging badly, and North Americans wouldn’t hesitate to call them “slums.” We like to think that slums cause crime, broken homes, and stunted mental development. Yet, here, you can walk about in safety, families are usually intact, and the children are studying hard to become engineers, scientists, ballet dancers, or what have you.

**********************************************************

We were sitting in a restaurant with two young Russians, a lawyer and a university teacher. “Will there be war?” said one, looking worried. I tried to be reassuring, saying no one wanted war. But I wasn’t sure myself.

There was another question. “But do the Americans know what they’re getting into?” I shook my head. Few people in the West know much about Russia, and what little they do know is worse than useless.

***********************************************************

Hitler said it would be like kicking in the door of a rotten building. That’s how it seemed at first. And then the war dragged on and on, grinding down one German division after another. If—God forbid—war happens another time, we’ll probably see the same pattern. Without a higher purpose, the average Russian man often retreats into indolence, alcoholism, and self-destructive behavior. Give him that purpose, and he will fight for it with almost superhuman power.

One of my professors ascribed it to the yearly cycle of traditional farm life. For most of the year, the muzhik slept a lot and whiled away his days in aimlessness. But when it came time to plough the fields or bring in the harvest, he had to pull out all the stops and work continuously from dawn to dusk.

************************************************************

It’s the 70th anniversary of victory in the Great Patriotic War, and reminders can be seen everywhere. There has been a spate of new war movies, including one about the Battle for Sevastopol. It’s hard not to see references to the current conflict.

Behaviorism and the revival of antiracism


John B. Watson conditioning a child to fear Santa Claus. With a properly controlled environment, he believed, children could be conditioned to think and behave in any way desired.

 

After peaking in the mid-19th century, antiracism fell into decline in the U.S., remaining dominant only in the Northeast. By the 1930s, however, it was clearly reviving, largely through the efforts of the anthropologist Franz Boas and his students.

But a timid revival had already begun during the previous two decades. In the political arena, the NAACP had been founded in 1910 under the aegis of WASP and, later, Jewish benefactors. In academia, the 1920s saw a growing belief in the plasticity of human nature, largely through the behaviorist school of psychology.

The founder of behaviorism was an unlikely antiracist. A white southerner who had been twice arrested in high school for fighting with African American boys, John B. Watson (1878-1958) initially held a balanced view on the relative importance of nature vs. nurture. His book Psychology from the Standpoint of a Behaviorist (1919) contained two chapters on "unlearned behavior". The first chapter is summarized as follows:

In this chapter, we examine man as a reacting organism, and specifically some of the reactions which belong to his hereditary equipment. Human action as a whole can be divided into hereditary modes of response (emotional and instinctive), and acquired modes of response (habit). Each of these two broad divisions is capable of many subdivisions. It is obvious both from the standpoint of common-sense and of laboratory experimentation that the hereditary and acquired forms of activity begin to overlap early in life. Emotional reactions become wholly separated from the stimuli that originally called them out (transfer), and the instinctive positive reaction tendencies displayed by the child soon become overlaid with the organized habits of the adult.

By the mid-1920s, however, he had largely abandoned this balanced view and embraced a much more radical environmentalism, as seen in Behaviorism (1924):

Our conclusion, then, is that we have no real evidence of the inheritance of traits. I would feel perfectly confident in the ultimately favorable outcome of a healthy, well-formed baby born of a long line of crooks, murderers and thieves, and prostitutes. (Watson, 1924, p. 82)

[...] Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select—doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. I am going beyond my facts and I admit it, but so have the advocates of the contrary and they have been doing it for many thousands of years. (Watson, 1924, p. 82)

Everything we have been in the habit of calling "instinct" today is a result largely of training—belongs to man's learned behavior. As a corollary from this I wish to draw the conclusion that there is no such thing as an inheritance of capacity, talent, temperament, mental constitution, and characteristics. These things again depend on training that goes on mainly in the cradle. (Watson, 1924, p. 74)

Why the shift to extreme environmentalism? It was not a product of ongoing academic research. In fact, Watson was no longer in academia, having lost his position in 1920 at Johns Hopkins University after an affair with a graduate student. At the age of 42, he had to start a new career as an executive at a New York advertising agency. Some writers attribute this ideological shift to his move from academia to advertising:

Todd (1994) noted that after Watson lost his academic post at Johns Hopkins, he abandoned scientific restraint in favor of significantly increased stridency and extremism, such that there were "two Watsons—a pre-1920, academic Watson and a post-1920, postacademic Watson" (p. 167). Logue (1994) argued that Watson's shift from an even-handed consideration of heredity and environment to a position of bombast and extreme environmentalism was motivated by the need to make money and the desire to stay in the limelight after he left academia. (Rakos, 2013)

There was another reason: the acrimonious debate in the mid-1920s over immigration, particularly over whether the United States was receiving immigrants of dubious quality. Rakos (2013) points to Watson's increasingly harsh words on eugenics and the political background: "It is probably no coincidence that only in the 1924 edition of the book—published in the same year that Congress passed the restrictive Johnson-Lodge Immigration Act—did Watson express his belief that behaviorism can promote social harmony in a world being transformed by industrialization and the movement of peoples across the globe."  

Eugenics is mentioned, negatively, in his 1924 book:

But you say: "Is there nothing in heredity-is there nothing in eugenics-[...] has there been no progress in human evolution?" Let us examine a few of the questions you are now bursting to utter.

Certainly black parents will bear black children [...]. Certainly the yellow-skinned Chinese parents will bear a yellow skinned offspring. Certainly Caucasian parents will bear white children. But these differences are relatively slight. They are due among other things to differences in the amount and kind of pigments in the skin. I defy anyone to take these infants at birth, study their behavior, and mark off differences in behavior that will characterize white from black and white or black from yellow. There will be differences in behavior but the burden of proof is upon the individual be he biologist or eugenicist who claims that these racial differences are greater than the individual differences. (Watson, 1924, p. 76)

You will probably say that I am flying in the face of the known facts of eugenics and experimental evolution—that the geneticists have proven that many of the behavior characteristics of the parents are handed down to the offspring—they will cite mathematical ability, musical ability, and many, many other types. My reply is that the geneticists are working under the banner of the old "faculty" psychology. One need not give very much weight to any of their present conclusions. (Watson, 1924, p. 79) 


Conclusion

Antiracism did not revive during the interwar years because of new data. Watson's shift to radical environmentalism took place a half-decade after his departure from academia. It was as an advertising executive, and as a crusader against the 1924 Immigration Act, that he entered the "environmentalist" phase of his life. This phase, though poor in actual research, was rich in countless newspaper and magazine articles that would spread his behaviorist gospel to a mass audience.

The same could be said for Franz Boas. He, too, made his shift to radical antiracism when he was already semi-retired and well into his 70s. Although this phase of his life produced very little research, it saw the publication of many books and articles for the general public. As with Watson, the influence of external political events was decisive, specifically the rise of Nazism in the early 1930s.

In both cases, biographers have tried to explain this ideological shift by projecting it backward in time to earlier research. Boas' antiracism is often ascribed to an early study that purported to show differences in cranial form between European immigrants and their children (Boas, 1912). Yet Boas himself was reluctant to draw any conclusions at the time, merely saying we should "await further evidence before committing ourselves to theories that cannot be proven." Later reanalysis found no change in skull shape once age had been taken into account (Fergus, 2003). More to the point, Boas continued over the next two decades to cite differences in skull size as evidence for black-white differences in mental makeup (Frost, 2015).

Watson's radical environmentalism has likewise been explained by his Little Albert Experiment in 1920, an attempt to condition a fear response in an 11-month-old child. Aside from the small sample size (one child) and the lack of any replication, it is difficult to see how this finding could justify his later sweeping pronouncements on environmentalism. There were admittedly other experiments, but they came to an abrupt end with his dismissal from Johns Hopkins, and little is known about their long-term effects:

Watson tested his theories on how to condition children to express fear, love, or rage—emotions Watson conjectured were the basic elements of human nature. Among other techniques, he dropped (and caught) infants to generate fear and suggested that stimulation of the genital area would create feelings of love. In another chilling project, Watson boasted to Goodnow in summer 1920 that the National Research Council had approved a children's hospital he proposed that would include rooms for his infant psychology experiments. He planned to spend weekends working at the "Washington infant laboratory." (Simpson, 2000)

Watson did apply behaviorism to the upbringing of his own children. The results were disappointing. His first marriage produced a daughter who made multiple suicide attempts and a son who sponged off his father. His second marriage produced two sons, one of whom committed suicide (Anon, 2005). His granddaughter similarly suffered from her behaviorist upbringing and denounced it in her memoir Breaking the Silence. Towards the end of his life Watson regretted much of his child-rearing advice (Simpson, 2000).


References 

Anon (2005). The long dark night of behaviorism, Psych 101 Revisited, September 6
http://robothink.blogspot.ca/2005/09/long-dark-night-of-behaviorism.html

Boas, F. (1912). Changes in the Bodily Form of Descendants of Immigrants, American Anthropologist, 14, 530-562. 

Fergus, C. (2003). Boas, Bones, and Race, May 4, Penn State News
http://news.psu.edu/story/140739/2003/05/01/research/boas-bones-and-race 

Frost, P. (2015). More on the younger Franz Boas, Evo and Proud, April 18
http://www.evoandproud.blogspot.ca/2015/04/more-on-younger-franz-boas.html 

Rakos, R.F. (2013). John B. Watson's 1913 "Behaviorist Manifesto": Setting the stage for behaviorism's social action legacy, Revista Mexicana de analisis de la conducta, 39(2)
http://rmac-mx.org/john-b-watsons-1913-behaviorist-manifestosetting-the-stage-for-behaviorisms-social-action-legacy/  
 
Simpson, J.C. (2000). It's All in the Upbringing, Johns Hopkins Magazine, April
http://pages.jh.edu/~jhumag/0400web/35.html 

Watson, J.B. (1919). Psychology from the Standpoint of a Behaviorist.
http://psycnet.apa.org/psycinfo/2009-03123-000/  

Watson, J. B. (1924). Behaviorism. New York: People's Institute.
http://books.google.ca/books?hl=fr&lr=&id=PhnCSSy0UWQC&oi=fnd&pg=PR10&dq=behaviorism+watson&ots=tW26oNvzjs&sig=YtDpYTYq3hE80QHJfo1Q4ebsuPI#v=onepage&q=behaviorism%20watson&f=false

Age of reason


Rally in Sydney (Wikicommons). Antiracists see themselves as open-minded individuals at war with hardline ideologues.

 

The interwar years gave antiracism a new lease on life, thus reversing a long decline that had begun in the late 19th century. This reversal was driven largely by two events: the acrimonious debate over U.S. immigration in the mid-1920s and Hitler's rise to power in the early 1930s. Many people, especially academics, were convinced of the need for an uncompromising war on "racism"—a word just entering use as a synonym for Nazism.

The war on racism began in the social sciences, especially through the efforts of John B. Watson in psychology and the Boasian triad in anthropology (Franz Boas, Ruth Benedict, Margaret Mead). After initially holding a more balanced view, these social scientists began to argue that genes contribute little to differences in behavior and mental makeup, especially between human populations.

In addition to the political context, there was also the broader cultural setting. The 1920s brought a flowering of African and African-American influences on popular culture, as seen in the Harlem Renaissance, the emergence of jazz, and the infatuation with art nègre. African Americans were viewed no longer as an embarrassment but as a source of excitement and novelty. In this role, black singers, musicians, and artists would lead the way in mobilizing mainstream support for the war on racism, such as Marian Anderson in her concert at the Lincoln Memorial and Paul Robeson through his political activism.

Would things have turned out differently if the immigration debate of the 1920s had been less acrimonious or if Hitler had not come to power? The most widespread answer seems to be "no"—sooner or later, men and women of reason would have broken free of the ideological straitjacket imposed by racism, social Darwinism, and hereditarianism. Franz Boas said as much in an interview he gave in 1936: "I will try to clean up some of the nonsense that is being spread about race those days. I think the question is particularly important for this country, too; as here also people are going crazy" (JTA, 1942).

How true is this view? Was the war on racism a healthy reaction to a mad ideology?

First, the word "racism" scarcely existed in its current sense back then. Continuous use dates from the 1920s and initially referred to the integral "blood and soil" nationalism that was becoming popular, especially in Germany, the word "racist" itself being perhaps a translation of the German Völkisch. Its use in a broader sense is largely postwar and has rarely been positive or even neutral. It's an insult. The racist must be re-educated and, if necessary, eliminated.

If the racist is no longer an ignorant person but rather a villain, and if he is defined by his impulses or negative passions (hate, aggressive intolerance, etc.), then the evil is in him, and his case seems hopeless. The antiracist's task is no longer to lead the "racist" towards goodness, but rather to isolate him as a carrier of evil. The "racist" must be singled out and stigmatized. (Taguieff, 2013)

The term "social Darwinism" likewise came into use well after the period when it was supposedly dominant:

Bannister (1988) and Bellomy (1984) established that "social Darwinism" was all but unknown to English-speaking readers before the Progressive Era. Hodgson's (2004) bibliometric analysis identified a mere eleven instances of "social Darwinism" in the Anglophone literature (as represented by the JSTOR database) before 1916. Before 1916 "social Darwinism" had almost no currency whatsoever [...].

"Social Darwinism" did not acquire much greater currency between 1916 and 1943; a mere 49 articles and reviews employ the term. (Leonard, 2009)

The term did not become commonplace until 1944 with the publication of Social Darwinism in American Thought by Richard Hofstadter. Since then it has appeared 4,258 times in the academic literature. Like "racism" it has seldom been used positively or neutrally:

"Social Darwinism" had always been an epithet. From its very beginnings, reminds Bellomy (1984, p. 2), "social Darwinism" has been "heavily polemical, reserved for ideas with which a writer disagreed."(Leonard, 2009).

The term "hereditarianism" likewise entered common use long after its supposed golden age. According to Google Scholar, "hereditarian" and "hereditarianism" appear 0 times in the literature between 1890 and 1900, 6 times between 1900 and 1910, 8 times between 1910 and 1920, 18 times between 1920 and 1930, and 52 times between 1930 and 1940. In most cases, these terms seem to have been used pejoratively.

Thus, all three words entered common use when the beliefs they described were no longer dominant. More to the point, these words were more often used by opponents than by proponents, sometimes exclusively so.

Of course, an ideology doesn't need a name to exist. Many people engaged in racial thinking without bothering to label it. As Barkan (1992, p. xi) observes: “Prior to that time [the interwar years] social differentiation based upon real or assumed racial distinctions was thought to be part of the natural order.” It is difficult, however, to describe such thinking as an ideology, in the sense of a belief-system that seeks obedience to certain views and to a vision of what-must-be-done. William McDougall (1871-1938) was a prominent figure in psychology and is now described as a "scientific racist," yet his views showed little of the stridency we normally associate with ideology:

Racial qualities both physical and mental are extremely stable and persistent, and if the experience of each generation is in any manner or degree transmitted as modifications of the racial qualities, it is only in very slight degree, so as to produce any moulding effect only very slowly and in the course of generations.

I would submit the principle that, although differences of racial mental qualities are relatively small, so small as to be indistinguishable with certainty in individuals, they are yet of great importance for the life of nations, because they exert throughout many generations a constant bias upon the development of their culture and their institutions. (Mathews, 1925, p. 151)

Similarly, the sociologist William Graham Sumner (1840-1910) is described today as a "social Darwinist," even though the term was never applied to him during his lifetime or long after. He did believe in the struggle for existence: "Before the tribunal of nature a man has no more right to life than a rattlesnake; he has no more right to liberty than any wild beast; his right to pursuit of happiness is nothing but a license to maintain the struggle for existence..." (Sumner, 1913, p. 234). He saw such struggle, however, as an unfortunate constraint and not as a normative value. Efforts to abolish it would simply transfer it to other people:

The real misery of mankind is the struggle for existence; why not "declare" that there ought not to be any struggle for existence, and that there shall not be any more? Let it be decreed that existence is a natural right, and let it be secured in that way. If we attempt to execute this plan, it is plain that we shall not abolish the struggle for existence; we shall only bring it about that some men must fight that struggle for others. (Sumner, 1913, p. 222).

Yet his belief in the struggle for existence was not associated with imperialism and “might makes right.” Indeed, he considered imperialism a betrayal of America's traditions and opposed the Spanish-American War and America’s subsequent annexation of the Philippines. A class of plutocrats would, he felt, come into being and foment imperialist wars in the hope of securing government subsidies and contracts (Wikipedia, 2015).

Herbert John Fleure (1877-1969), a geographer and anthropologist, is similarly described today as a "scientific racist" who saw racial differentiation taking place even at the micro level of small communities:

[...] Fleure accepted the reality of racial differentiation even in Europe, where all the populations exhibit types of diverse origins living and maintaining those type characters side by side in spite of intermarriage and of absence of any consciousness of diversity. These various types, each with mental aptitudes and limitations that are in some degree correlated with their physique, make diverse contributions to the life of each people. (Barkan, 1992, p. 60)

Nonetheless, he condemned the "folly" of confusing such differentiation with language and nation states (Barkan, 1992, pp. 60-64). He also became a strong opponent of Nazism and attacked anti-Semitism in his lectures and articles (Kushner, 2008).

I could give other examples, but why bother? There was a spectrum of racial thinking that encompassed a wide range of scholars, many of whom were sympathetic to the plight of minorities. This variability is hardly surprising, given that racial thinking of one sort or another was typical of most educated people who came of age before the 1930s. Indeed, we are regularly treated to the discovery that some respected person, like Winston Churchill or Albert Schweitzer, was, in fact, a racist. This historical reality is embarrassing not just because the people in question are still role models, but also because it undermines the notion that antiracism freed us from an ideological straitjacket.

Conclusion

Words like "racism,""social Darwinism," and "hereditarianism" create the impression that a single monolithic ideology prevailed before the triumph of antiracism. Actually, the truth was almost the reverse. There was initially a wide spectrum of beliefs, as is normally the case before one belief pushes out its rivals and imposes its vision of reality. Antiracism triumphed because it was more ideological than its rivals; it possessed a unity of purpose that enabled it to neutralize one potential opponent after another. Often, the latter were unaware of this adversarial relationship and assumed they were dealing with a friendly ally.

History could have played out differently. Initially a tool in the struggle against Nazi Germany, antiracism became critically dependent on a postwar context of decolonization and Cold War rivalry. Without this favorable context, it would have had much more trouble seizing the moral high ground and locking down normal discourse. Its revival would have likely stalled at some point.

A world without antiracism could have still brought in laws against discrimination, particularly for the basics of life like housing and employment. But such efforts would have been driven not by ideology but by a pragmatic wish to create a livable society, like modern-day Singapore. In this alternate world, rational people would act rationally. They would not, for instance, be blindly sticking to antiracist principles—and insisting that everyone else do likewise—in the face of the demographic tsunami now sweeping out of Africa.

Social scientists in particular would be acting more rationally. They would not have to assume human sameness and arrange the facts accordingly. They would not face the same pressure to ignore embarrassing data, to choose the less likely explanation, and to keep quiet until ... until when? They would be free to work within the earlier, and more fruitful, paradigm that viewed human differences as a product of genes, culture, and gene-culture interaction. 

Such a paradigm could have absorbed findings on learning and conditioned reflexes, perhaps even better than the one we have now. Indeed, the current paradigm has trouble explaining why the effects of conditioning disappear at different rates, depending on what one has been conditioned to do. For instance, people lose a conditioned fear of snakes and spiders much more slowly than a conditioned fear of electrical outlets, even though the latter are more dangerous in current environments (Cook et al., 1986; Ohman et al., 1986). Conditioning, like learning in general, seems to interact not with a blank slate, but rather with pre-existing mental algorithms that have modifiable and non-modifiable sections.
 
Of course, this is not how history played out. We are living under an ideology that claims to be an anti-ideology while demanding the sort of conformity normally found in totalitarian societies. In the past, this contradiction largely went unnoticed, perhaps because the full extent of the antiracist project remained poorly known. Or perhaps people chose not to know. Increasingly, however, even the pretence of not knowing is becoming difficult. As French philosopher Alain Finkielkraut wrote, "the lofty idea of the 'war on racism' is gradually turning into a hideously false ideology. And this anti-racism will be for the 21st century what communism was for the 20th century" (Caldwell, 2009).


References 

Barkan, E. (1992). The Retreat of Scientific Racism: Changing Concepts of Race in Britain and the United States Between the World Wars, Cambridge University Press.
https://books.google.ca/books?id=-c8aSO-gnwMC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

Caldwell, C. (2009). Reflections on the Revolution in Europe, Penguin.
https://books.google.ca/books?id=637_SgdPfnsC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

JTA (1942). Dr. Franz Boas, Debunker of Nazi Racial Theories, Dies in New York, December 23  http://www.jta.org/1942/12/23/archive/dr-franz-boas-debunker-of-nazi-racial-theories-dies-in-new-york 

Kushner, T. (2008). H. J. Fleure: a paradigm for inter-war race thinking in Britain, Patterns of Prejudice, 42
http://www.tandfonline.com/doi/abs/10.1080/00313220801996006 

Leonard, T.C. (2009). Origins of the myth of social Darwinism: The ambiguous legacy of Richard Hofstadter's Social Darwinism in American Thought, Journal of Economic Behavior & Organization, 71, 37-51
https://www.princeton.edu/~tleonard/papers/myth.pdf

Mathews, B. (1925). The Clash of Colour. A Study in the Problem of Race. London: Edinburgh House Press.

Ohman et al. (1986). Face the Beast and Fear the Face: Animal and Social Fears as Prototypes for Evolutionary Analyses of Emotion, Psychophysiology, 23, 123-145.

Sumner, W.G. (1913). Earth-hunger and other essays, ed. Albert Galloway Keller, New Haven: Yale University Press. 

Taguieff, P-A. (2013). Dictionnaire historique et critique du racisme, Paris: PUF.

Wikipedia (2015). William Graham Sumner
http://en.wikipedia.org/wiki/William_Graham_Sumner

Birth of a word


Memorial service for Walther Rathenau (Wikicommons - German Federal Archives). His assassination introduced a new word into French and, shortly after, into English.
 
 

A reader has written me about my last post:

It is extremely unlikely that "racism" is an attempt at translating something like Völkismus. Between Hitler's preference for Rasse (race) over Völk and the fact that the Nazis drew on authors like Chamberlain (whose antisemitism would also tend towards privileging Rasse over Völk) and Gobineau (who wrote in French), there is no support to be found for a derivation that would make "racism" appear to be related to the less virulent of the two strains of German nationalism (the romantic-idealistic one which relished being able to point at linguistic differentiation - like Völk vs. populus/people/peuple - and speculating about vague semantic correlates thereof). 

The simple fact of the matter is that "racism" is not any kind of translation but just a combination of a widely used term with a lexologically highly productive suffix. Critical use of "racism" basically starts in the 1920s with Théophile Simar. And Hirschfeld, whose book Racism secured wider currency for the term, clearly wanted to espouse an anthropological concept just as much as Boas et. al. did, although he didn't offer any detailed discussion beyond his roundabout rejection of traditional ideas. BTW, Hirschfeld lectured in the U.S. in 1931. While he wrote his German manuscript in 1933/1934, he may well have employed the term "racism" years earlier.

The best authority on this subject is probably Pierre-André Taguieff, who seems to have read everything about racism, racialism, or colorism. He found that continuous use of the word “racism” began in the 1920s, initially in French and shortly after in English. There is little doubt about the historical context:

In a book published late in 1922, Relations between Germany and France, the Germanist historian Henri Lichtenberger introduced the adjective racist in order to characterize the "extremist," "activist," and "fanatical" elements in the circles of the German national and nationalist right as they had just recently been manifested by the assassination in Berlin, on June 24, 1922, of Walther Rathenau:


The right indignantly condemned Rathenau's murder and denied any connection with the murderers. A campaign was even planned to expel from the Nationalist party the agitators of the extreme right known as "Germanists" or "racists," a group (deutschvölkische) whose foremost leaders are Wulle, Henning and von Graefe, and whose secret inspirer is supposed to be Ludendorff.


[...] The context of the term's appearance is significant: the description of the behavior of the "German nationals" and more precisely the "activist," "extreme right" fraction. The adjective racist is clearly presented as a French equivalent of the German word völkische, and always placed in quotation marks. [...] The term, having only just appeared, is already charged with criminalizing connotations.

In 1925, in his reference book L'Allemagne contemporaine, Edmond Vermeil expressly reintroduced the adjective racist to translate the "untranslatable" German term völkische and suggested the identification, which became trivial in the 1930s, of (German) racism with nationalist anti-Semitism or with the anti-Jewish tendencies of the nationalist movement in Germany in the 1920s:
 

It is in this way that the National German Party has little by little split into two camps. The "racist" (völkische) extreme right has separated from the party. Racism claims to reinforce nationalism, to struggle on the inside against all that is not German and on the outside for all those who bear the name German. [...] (Taguieff, 2001, pp. 88-89)

 
The term “racist” thus began as an awkward translation of the German völkische to describe ultranationalist parties. Initially, the noun "racism" did not exist, just as there was no corresponding noun in German. It first appeared in 1925, and in 1927 the French historian Marie de Roux used it to contrast his country’s nationalism, based on universal human rights, with radical German nationalism, which recognized no existence for human rights beyond that of the Volk that created them. "Racism [...] is the most acute form of this subjective nationalism," he wrote. The racist rejects universal principles. He does not seek to give the best of his culture to "the treasure of world culture." Instead, the racist says: "The particular way of thinking in my country, the way of feeling that belongs to it, is the absolute truth, the universal truth, and I will not rest or pause before I have ordered the world by law, that of my birth place" (Taguieff, 2001, pp. 91-94).

This was the original meaning of "racism," and it may seem far removed from the current meaning. Or maybe not. No matter how we use the word, the Nazi connotation is always there, sometimes lingering in the background, sometimes in plain view.

Conclusion

The noun "racism" was derived in French from an awkward translation of the German adjective völkische. Unlike the original source word, however, it has always had negative and even criminal connotations. It encapsulated everything that was going wrong with German nationalism in a single word and, as such, aggravated a worsening political climate that ultimately led to the Second World War.

When that war ended, the word "racism" wasn't decommissioned. It found a new use in a postwar context of decolonization, civil rights, and Cold War rivalry. Gradually, it took on a life of its own, convincing many people—even today—that the struggle against the Nazis never ended. They're still out there! 

It would be funny if the consequences weren't so tragic. Our obsession with long-dead Nazis is blinding us to current realities. In Europe, there have been many cases of Jews being assaulted and murdered because they are Jews. These crimes are greeted with indignation about how Europe is returning to its bad ways, and yet in almost every case the assailant turns out to be of immigrant origin, usually North African or sub-Saharan African. At that point, nothing more is said. One can almost hear the mental confusion.


References 

Frost, P. (2013). More thoughts. The evolution of a word, Evo and Proud, May 18
http://evoandproud.blogspot.ca/2013/05/more-thoughts-evolution-of-word.html 

Taguieff, P-A. (2001). The Force of Prejudice: On Racism and its Doubles, University of Minnesota Press.
https://books.google.ca/books?id=AcOG6Y9XG40C&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false

The monster in the mirror


Cyborg She, a love story about a female android and a shy young man (credit: Gaga Communications, for use in critical commentary)

 

Can humans and robots get along together? Actually, they already do in a wide range of applications from surgery to assembly lines. The question is more vexing when the robots are androids—human-like creatures that can recognize faces, understand questions, and behave as social, emotional, and affective beings. It is this aspect that troubles us the most, partly because it creates a power to manipulate and partly because it transgresses the boundary between human and nonhuman.

A manipulative female android appears in the recent British film Ex Machina. Ava exploits Caleb's sexual desire and sense of compassion, convincing him to help her escape from the research facility. She succeeds but leaves him behind, trapped in the building. This kind of negative portrayal runs through many sci-fi movies of the past four decades. In some, particularly the Terminator series (1984, 1991, 2003, 2009, 2015), androids are evil and seek to destroy mankind. In The Stepford Wives (1975), they are simply tools of wicked people: in a small town, the men conspire to murder their wives and replace them with lookalike android homemakers. In Westworld (1973), a Wild West theme park becomes a killing field when a gunslinger robot begins to take his role too seriously.

In other movies, the portrayal is more nuanced but still negative. Blade Runner (1982) assigns the human Rick Deckard the role of a bad good-guy who seeks out and kills android "replicants." Deckard hunts them down mercilessly, the only exception being Rachael, whom he rapes. Conversely, the replicants emerge as good bad-guys who show human mercy, particularly in the final scene when the last surviving one saves Deckard from death. This theme is further developed in AI (2001), where a couple adopt an android boy, named David, after their son falls victim to a rare virus and is placed in suspended animation. When their biological son is unexpectedly cured, and refuses to accept his new sibling, they decide to abandon David in a forest, much as some people get rid of unwanted pets. He meets another android, Gigolo Joe, who explains why David's love for his adoptive mother can never be reciprocated:

She loves what you do for her, as my customers love what it is I do for them. But she does not love you, David. She cannot love you. You are neither flesh nor blood. You are not a dog or a cat or a canary. You were designed and built specific, like the rest of us, and you are alone now only because they are tired of you, or they replaced you with a younger model or were displeased by something you said or broke.

In short, androids can love humans, but this love has a corrupting effect, making humans more callous and self-centered than ever. 

Some American and British movies have featured androids in unambiguously positive roles, like some of the droids in Star Wars (1977), Lisa in Weird Science (1985), Bishop in Aliens (1986), and Data in the TV series Star Trek: The Next Generation (1987-1994). Usually, however, androids are either villains or tragic heroes. One might conclude, therefore, that this dominant view is the logical one that emerges when thoughtful people weigh all the pros and cons.

And yet, we have the example of another cinematographic tradition where androids are viewed quite differently.

The Japanese exception?

Japan has diverged from Western countries in the way it depicts androids on screen. This is especially so in three productions that have appeared since the turn of the century: Chobits (2002), Cyborg She (2008), and Q10 (2010).


Chobits, a TV series, begins in the near future with Hideki, a young man who lives on a remote farm. He has never had a girlfriend and decides to go to a prep school in Tokyo, where he can meet other people his age. On arriving in the big city, he is surprised to see so many androids, called “persocoms.” The life-sized ones are expensive, but many of his college friends have mini-persocoms—small fairy-like creatures, a bit larger than Tinkerbell, who can take email messages, help with schoolwork, provide GPS directions, or simply sing and dance to keep your spirits up.

One night, walking home, he sees a girl's body in the trash piled alongside the curb. He takes a closer look, realizes it's a persocom, and takes it home, where he manages to turn it on. But the persocom—a strangely beautiful girl with large eyes and floor-length hair—can speak only one word and knows nothing about the world. Hideki tries to teach her how to live in society, but he too is socially inept, so other people have to step in to provide help and advice.

From time to time, we see the girl with a children's book that Hideki bought to teach her how to read. It is about a place called Empty Town where people remain secluded in their homes and refuse to venture outside. At the end of each episode, we see this town and a female figure wandering through its deserted streets.

Chobits seems to have been made principally for a mature male audience, while containing elements that normally appear in magazines for teen and pre-teen girls. This is not surprising, given that it was created overwhelmingly by female storyboarders and animators.


Most of Cyborg She, a feature film, is set in the present. There are obvious similarities with The Terminator (1984): an android arrives from the future in an electrical discharge; it has superhuman strength and, initially, no emotions; and near the end it must crawl around on its arms because it has lost the lower half of its body. But the similarities end there. The android is female and has come to befriend a shy young man, Kiro, who is spending his 20th birthday alone. She is, in fact, a creation of an older Kiro who wishes to change the course of his life. In this role, she saves him from a gunman who would otherwise leave him a cripple and, later, from a devastating earthquake. She also breaks his vicious circle of shyness/withdrawal, thus transforming him from a boy into a man.

The changes to Kiro are paralleled by changes to her. She develops feelings of jealousy and becomes conscious of her appearance; after being mutilated by a collapsing wall, she begs Kiro to leave, so that he will no longer see what she has become. In these final moments of her life, she tells Kiro that she can "feel his heart." The rest of the building then collapses on her, and when he later retrieves her remains from the rubble, he clings to them, overwhelmed by grief.


Q10, another TV series, features a timid boy called Heita who attends a private high school. He feels a chasm between himself and the world of love, preferring to be alone in places like the school's science lab. One day, however, he enters the lab and finds the inanimate body of an android girl. When he touches her teeth, she comes to life and asks him to give her a name. He chooses “Kyuuto” because her serial number is Q10 … and because she’s cute.

She follows Heita everywhere, and the principal tries to head off a potential scandal by enrolling her at the school and making the boy her caretaker. Heita tells his science teacher that he doesn't want the job and asks her to turn the android off, but she simply smiles and says there is no going back. The rest of the series recounts the weird love that develops between Heita and Kyuuto.

A common theme

You may have noticed a common theme: male shyness. It's nothing new in Japanese society. Indeed, it seems to prevail in all societies where the father invests much time and energy in providing for his wife and children. In exchange, he wants to be sure that the children are his own. So monogamy is the rule, and something is needed to keep the same man and woman together.

In such a context, male shyness deters men from sexual adventurism, i.e., wandering from one woman to another. Of course, the shyness must not be so strong that it leaves a man with no mate at all. This is not a problem in traditional societies, where intermediaries can step in and help the process along. It becomes a major problem, however, in modern societies where each man is expected to be a sexual entrepreneur.

Male shyness is becoming pathological in today’s Japan. The pathology even has a name: hikikomori—acute withdrawal from all social relationships outside the family. Numbers are hard to come by, but such people may number over a million in Japan alone, with 70-80% of them being men (Furlong, 2008). These figures are really the tip of the iceberg, since many men can lead seemingly normal lives while having no intimate relationships.

A form of therapy?

When the Japanese talk about future uses of androids, they invariably talk about elder care or home maintenance. It is really only in movies and manga comics that the subject of loving relationships is explored, and this is where we see the greatest difference between Japanese and Westerners. The latter seem pessimistic, seeing such love as manipulative or corrupting. In contrast, the Japanese see it as beneficial, even therapeutic.

Who is right? Some insight may be gleaned from research on love dolls, which occupy an early stage of the trajectory that leads to affective androids. In a study of 61 love doll owners, Valverde (2012) found them to be no different from the general population in terms of psychosexual functioning and life satisfaction. In contrast, the rate of depression was much higher among individuals who did not own a love doll but were planning to buy one. It seems likely, then, that the dolls are enabling these men to achieve a healthier psychological state. We will probably see a similar therapeutic effect with affective androids. 

But will this psychological improvement help such men move on to real human relationships? After all, many of them will simply be too unattractive, too socially marginal, or too lacking in personality to make the transition. Others may prefer androids to real women. This point comes up in Chobits when a woman tells Hideki that she feels jealous of his android and its perfect beauty.

One thing is sure. No android, no matter how lifelike, can procreate. When Hideki is walking with a friend by a lake, he is warned that an android can never be as good as a real human. We then see a woman in a boat, with two young children. This fact also explains the convoluted ending of Cyborg She. There can be no happy ending until Kiro's life path is fully rectified, and this can happen only when he becomes a husband and father. Through a series of unusual events, the android's memory is transferred to a similar-looking woman who then travels back in time to meet Kiro after the earthquake. 

Although we will soon have androids that can recognize individual humans and respond to them affectively, there are no procreative models on the drawing board. This limitation will have to be recognized before we begin to use them for therapeutic purposes.

Two different paths

Why does Japan have a more positive attitude toward androids in particular and robots in general? Most observers put it down to the animist roots of the country’s religion, Shinto, which teaches that everything has a spirit, be it the sun, the moon, mountains, trees, or even man-made objects (Mims, 2010). In contrast, Christianity teaches that only humans have souls, so there is no moral difference between swatting a fly and killing an android. When Deckard rapes Rachael, he is merely masturbating. She loves him, but her love can only have a corrupting effect because humans of Christian heritage feel no need to reciprocate. 

This cultural explanation isn’t perfect. For one thing, the divergence between Japan and the West is less obvious the farther back in time you go (Anon, 2013). Before the 1970s, robots were generally likeable characters on the American big screen or small screen, from the Tin Man of The Wizard of Oz (1939) to the robot of Lost in Space (1965-1968). There was even a romance genre: in the seventh episode of The Twilight Zone (1959), a female android saves a man from the loneliness of solitary confinement.

The change of attitude among cineastes seems to have happened during the 1970s. Perhaps not coincidentally, the same decade saw a parallel change of attitude in the business community. Previously, with the West moving toward an increasingly high-wage economy, automation and robotization were considered inevitable, since there would be nobody available to do low-paying jobs. This attitude changed during the 1970s with the growing possibilities for outsourcing of high-wage manufacturing jobs to low-wage countries and, conversely, insourcing of low-wage workers into industries that could not outsource abroad (construction, services, etc.). This easier access to cheap labor made the business community less interested in robots, so much so that robotics research has largely retreated to military applications. There is very little research into use of robots as caregivers or helpmates. 

This new economic reality has spawned a strange form of Japan-bashing in the press, as in this Washington Post story:

There are critics who describe the robot cure for an aging society as little more than high-tech quackery. They say that robots are a politically expedient palliative that allows politicians and corporate leaders to avoid wrenchingly difficult social issues, such as Japan's deep-seated aversion to immigration, its chronic shortage of affordable day care and Japanese women's increasing rejection of motherhood.

"Robots can be useful, but they cannot come close to overcoming the problem of population decline," said Hidenori Sakanaka, former head of the Tokyo Immigration Bureau and now director of the Japan Immigration Policy Institute, a research group in Tokyo. "The government would do much better spending its money to recruit, educate and nurture immigrants," he said. (Harden, 2008)

Of course, this kind of argument could be stood on its head. Aren’t we using immigration to evade the twin challenges of caring for an aging population and of robotizing low-paying jobs out of existence? 

Conclusion

It is no longer fashionable to believe that economics can influence culture and ideology. Yet there seems to be some linkage between the growing indifference toward robots in our business community and the growing hostility toward them in our popular culture. In Japan, major corporations like Honda strive to rally popular opinion in favor of robotics. In the West, big business plays no such role and, if anything, has to justify its relative indifference. There is thus no organized faction that can push back against anti-robotic views when and if they arise.

So we will fail in robotics because we’re not trying very hard to succeed. This is one of those basic rules of life: if you don’t try, not much is going to happen.

But will the Japanese succeed? I cannot say for sure. I can only say there is a lot of pent-up demand for personal robots, especially androids with affective capabilities. Modern society is creating loneliness on a massive scale with its war on “irrational” and “repressive” forms of sociality—like the family and the ethny. I remember doing fieldwork among elderly people on Île aux Coudres and expecting no end of trouble with my stupid questions about attitudes toward skin colour in a traditional mono-ethnic environment. I needn’t have worried. The interviewees showed an unusual degree of interest in my questions and would talk for hours on end. Then I discovered these people typically went for days—sometimes weeks—with no human contact at all. And then others would tell me that so-and-so next door had committed suicide, not because of terminal illness but because of terminal loneliness.

Mark my words. When cyber-Tinkerbells start appearing in stores, people will come in droves to snatch them up like there’s no tomorrow. And many will also be snatching up the life-sized equivalents—even if they cost as much as a Lamborghini.


References 

Anon. (2013). Debunked: Japan's "Special Relationship with Robots", Home Japan
http://www.homejapan.com/robot_myth 

Chobits (2002). Japanese TV series, directed by Morio Asaka, 26 episodes
https://www.youtube.com/watch?v=ingYFsjgaZ4 

Cyborg She (2008). Japanese drama, directed and written by Kwak Jae-yong
https://www.youtube.com/watch?v=lO7OEAzZ4aU 

Furlong, A. (2008). The Japanese hikikomori phenomenon: acute social withdrawal among young people, The Sociological Review, 56, 309-325
http://onlinelibrary.wiley.com/doi/10.1111/j.1467-954X.2008.00790.x/full 

Harden, B. (2008). Demographic crisis, robotic cure? Washington Post, January 7
http://www.washingtonpost.com/wp-dyn/content/article/2008/01/06/AR2008010602023.html 

Mims, C. (2010). Why Japanese Love Robots (And Americans Fear Them), MIT Technology Review, October 12
http://www.technologyreview.com/view/421187/why-japanese-love-robots-and-americans-fear-them/ 

Q10 (2010). Japanese TV series, directed by Kariyama Shunsuke and Sakuma Noriyoshi, 9 episodes
https://www.youtube.com/watch?v=tE_NEjPjdSI 

Valverde, S.H. (2012). The modern sex doll-owner: a descriptive analysis, master's thesis, Department of Psychology, California State Polytechnic University.
http://digitalcommons.calpoly.edu/theses/849/

Imagining the future, imagining death


 
On Star Trek, African Americans were underrepresented among guest actors; such roles were just as likely to go to part-Asian actresses like France Nuyen (Wikicommons)

 

Only six years separate the production of Logan's Run (1976) from that of Blade Runner (1982), yet those intervening years form a watershed in how science fiction imagined the future. The first movie depicts the year 2274. The setting is futuristic, and the people so beautiful that one significant detail may go unnoticed. Eventually, the penny drops—everyone is white! The future looks very different in the second movie. We’re only in the year 2019, and whites are already a minority in Los Angeles; indeed, if we exclude the replicants, there don't seem to be many left.

This change in our imagined future is especially noticeable if we compare pre-1980 movies with post-1980 remakes. In The Time Machine (1960), the future is inhabited by two races: the Eloi and the Morlocks. Both are descended from present-day humans, but only the Eloi still look human. Not only that, they have fair skin and blonde hair. It's the year 802701, and those folks are still around! The Eloi look a lot different in the 2002 remake: they are now a dark-skinned people of mixed Afro-Asian descent, in contrast to the pale Morlocks. This physical difference is absent from the original film and the book itself, which repeatedly describes the Eloi as fair-skinned: "[I was] surrounded by an eddying mass of bright, soft-colored robes and shining white limbs" (Wells, 1898, p. 24); "I would watch for [Weena’s] tiny figure of white and gold" (Wells, 1898, p.41); "I looked at little Weena sleeping beside me, her face white and starlike under the stars" (Wells, 1898, p. 57). In the remake, the only people who look approximately white are the Über-Morlocks ... and they feed on human flesh. A fair-skinned viewer would be torn between two conflicting responses: a desire to identify with the Über-Morlocks as People Who Look Like Me and a desire to hate them as morally worthless. This situation is almost the reverse of the original story line: the Time Traveller is misled by the familiar appearance of the Eloi and develops affection for them, even love, only to realize that they are as different from him as the hideous Morlocks.

Even before 1980, we see some awareness in sci-fi that whites would, one day, no longer have societies of their own. Star Trek (1966-1969) led the way in this direction; nonetheless, the ship’s crew looks overwhelmingly white, partly because the American population was still overwhelmingly white during those years and partly because of the small pool of African American actors. Very few of the latter appear in guest roles, which were just as often filled by part-Asian actresses like France Nuyen, born of a Vietnamese father and a Roma mother (Elaan of Troyius), or Barbara Luna, of mixed Filipino and European descent (Mirror, Mirror). This was the 1960s, when antiracism was still taking shape and was driven, it seems, partly by a desire to see exotic-looking women.

All the same, those years saw a general tendency to raise the visibility of African Americans on both the big screen and the little screen. Sci-fi was no exception, particularly by the 1980s. In the Alien series (1979, 1986, 1992), the casts are multiracial, although whites still predominate. Just as significantly, the taboo against a non-white killing a white is broken, albeit in a seemingly acceptable way:

In Alien itself the representative of the company is an android named Ash (something white) - he is a white man who is not human. This is revealed when an African-American crew member pulls off Ash's head: the black man reveals the nothingness of the white man and destroys him by depriving him of his brain, the site of his spirit. The crew bring this severed head back to temporary electronic life to find out how the alien can be destroyed. He tells them that it is indestructible and one of the crew realizes that he admires it. 'I admire its purity', he says, adding in a cut to an extreme, intensifying close-up, 'unclouded by conscience, remorse or delusions of morality.' Purity and absence of affect, the essence of the aspiration of whiteness, said in a state of half-life by a white man who has never really been alive anyway. (Dyer, 2000)

It is really only with Blade Runner (1982) that popular culture began to acknowledge the imminence of white demise. We think of the 1980s as the Reagan Era, a time when White America pushed back after a long retreat during the previous two decades. In reality, the retreat picked up speed. The endgame was already apparent to anyone who gave it much thought, like Blade Runner's scriptwriters. Thus, in the year 2019, we see whites inhabiting a world that is no longer theirs, with some like Sebastian living alone within the decaying shell of their past—the grand but neglected building where most of the action takes place. The least pathetic white is Rachael, a replicant. She also seems the least WASP-looking with her dark hair and her family photos, which suggest a southern European, Armenian, or Jewish origin. The photos themselves are a lie—like the loner Deckard she has no real collective identity, but she does have an imagined one.

We now come to a common theme of love stories: how a fallen man is redeemed by the love of a woman. Here, the fallen man is Deckard—a remnant of a White America in terminal decline. The woman is Rachael, who wants to give him a future of love, marriage, and family, even though this prospect is no more viable than her own imaginary past.

Rachael offers the possibility of developing true emotions [...] The two dark 'whites' [Rachael and Gaff] offer something definite, real, physical to the nothingness of the indifferently fair white man. In the first version, Deckard and Rachael escape, the film ending with a lyrical (if naff) flight away from Los Angeles and perhaps Earth: the dark woman's discovery of true feeling (she weeps) redeems the fair - truly white - man's emptiness. This ending is absent from the 'director's cut'; the dark woman cannot redeem the fair man (Dyer, 2000)

Blade Runner is a film noir with no happy ending in the traditional sense. Even if the two of them did escape to build a life together, it's hard to see how this new life could evolve into anything more than two deracinated individuals with no past and no clear future. Can Rachael have children? Doubtful. It's also doubtful whether Deckard would want to settle down and become a family man. What would he do to support a family? Go back to hunting replicants?

The film does not address these questions. Nor should it. Whether you are for or against, white demise is something to be addressed collectively, and not at the level of individuals. This point is made in the writings of Richard Dyer and other postmodernists who welcome a future of collective death and feel that whites should come to terms with it:

Whites often seem to have a special relation with death, to yearn for it but also to bring it to others. [...] I have been wary of dwelling on the fearfulness - sometimes horrible, sometimes bleak - of the white association with death. To do so risks making whites look tragic and sad and thus comes perilously close to a 'me-too', 'we're oppressed', 'poor us' position that seems to equalise suffering, to ignore that active role of whites in promulgating inequality and suffering. It could easily be taken as giving us a let-out from acknowledging the privilege and effortless power of even the most lowly of those designated as white. Yet, if the white association with death is the logical outcome of the way in which whites have had power, then perhaps recognition of our deathliness may be the one thing that will make us relinquish it.

This sounds ominous. It strangely resembles what some people wrote in the 19th century about the disappearing American Indian and the disappearing Australian Aborigines. It was all for the best, some argued. As "savages" declined in numbers and disappeared, their lands would be resettled and better societies created. Today, whites are being seen in this light. Their departure from existence will purportedly bring an end to inequality and suffering, thus making the world a better place.

So goes the narrative, and few seem to be challenging it, no matter how outrageous it becomes.

Conclusion 

Imagined reality often foretells the real thing—not because the imaginers have a special knack for prediction, but because they end up playing an active role in shaping the future. The death of White America was already being imagined over three decades ago by people who, ultimately, had become reconciled to that fate and even looked forward to it. Moreover, this endgame seems to have struck a responsive chord among the public. As Dyer (2000) argues, "the death of whiteness is, as far as white identity goes, the cultural dominant of our times, that we really do feel we're played out."
 

References

Dyer, R. (2000). Whites are nothing: Whiteness, representation and death, in I. Santaolalla (ed.) "New" Exoticisms: Changing Patterns in the Construction of Otherness, (pp. 135-155), Rodopi
https://books.google.ca/books?id=ew0q5AxMfkEC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

King, C.R. and D.J. Leonard. (2004). Is neo white? Reading race, watching the trilogy, in M. Kapell and W.G. Doty (eds). Jacking in to the Matrix Franchise: cultural reception and interpretation, (pp. 32-46), A&C Black.
https://books.google.ca/books?id=ETf0her6UDgC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Wells, H.G. (1898). The Time Machine, online edition
http://www.literaturepage.com/read/thetimemachine.html

Feeling the other's pain


 
In the Reign of Terror, by Jessie Macgregor (1891). We don’t respond equally to signs of emotional distress in other people (Wikicommons)



We like to think that all people feel empathy to the same degree. In reality, it varies a lot from one person to the next, like most mental traits. We are half-aware of this when we distinguish between "normal people" and "psychopaths," the latter having an abnormally low capacity for empathy. The distinction is arbitrary, like the one between "tall" and "short." As with stature, empathy varies continuously among the individuals of a population, with psychopaths being the ones we find beyond an arbitrary cut-off point and who probably have many other things wrong with them. By focusing on the normal/abnormal dichotomy, we lose sight of the variation that occurs among so-called normal individuals. We probably meet people every day who have a low capacity for empathy and who nonetheless look and act normal. Because they seem normal, we assume they are as empathetic as we are. They aren’t.

Like most mental traits, empathy is heritable, its heritability being estimated at 68% (Chakrabarti and Baron-Cohen, 2013). It has two distinct components: cognitive empathy and affective empathy. Some researchers identify a third component, pro-social behavior, but its relationship to the other two seems tangential.

Cognitive empathy appears to be the evolutionarily older of the two components. It is the capacity to understand how another person is feeling and then predict how different actions will affect that person’s emotional state. But this capacity can be used for selfish purposes. Examples are legion: the con artist; many telemarketers; the rapist who knows how to charm his victims ...

Affective empathy is the younger component, having developed out of cognitive empathy. It is the capacity not just to understand another person's emotional state but also to identify with it. A person with high affective empathy will try to help someone in distress not because such help is personally advantageous or legally required, but because he or she is actually feeling the same distress.

Affective empathy may have initially evolved as a means to facilitate relations between a mother and her children. Later, and to varying degrees, it became extended to other human relationships. This evolutionary trajectory is perceptible in young children:

Children do not display empathic concern toward all people equally. Instead, they show bias toward individuals and members of groups with which they identify. For instance, young children of 2 years of age display more concern-related behaviors toward their mother than toward unfamiliar people. Moreover, children (aged 3-9 years) view social categories as marking patterns of interpersonal obligations. They view people as responsible only to their own group members, and consider within-group harm as wrong regardless of explicit rules, but they view the wrongness of between-group harm as contingent on the presence of such rules. (Decety and Cowell, 2014)

Similarly, MRI studies show that adults are much more likely to experience emotional distress when they see loved ones in pain than when they see strangers in pain. A stranger in distress will evoke a response only to the degree that the observer has a high capacity for affective empathy. The higher the capacity, the more it will encompass not only loved ones but also less related individuals, including total strangers and nonhumans:

Humans can feel empathic concern for a wide range of 'others', including for nonhuman animals, such as pets (in the Western culture) or tamagotchi (in Japan). This is especially the case when signs of vulnerability and need are noticeable. In support of this, neural regions involved in perceiving the distress of other humans, such as the anterior cingulate cortex and insula, are similarly activated when witnessing the distress of domesticated animals (Decety and Cowell, 2014)

While we associate affective empathy with morality, the two are not the same, and there are situations where the two come into conflict. In most societies, kinship is the main organizing principle of social relations, and morality affirms this principle by spelling out the duties to one's parents, one's kin, and one's ethny. The importance of kinship may be seen in the Ten Commandments, which we wrongfully assume to be universal in application. We are told we must not kill, steal, lie, or commit adultery if the victims are "thy neighbor," which is explained as meaning "the children of thy people" (Leviticus 19:18). High-empathy individuals may thus subvert morality if they view all human distress as being equal in value. At best, they will neglect loved ones in order to help an indefinitely large number of needy strangers. At worst, strangers may develop strategies to exploit high-empathy individuals, i.e., to milk them for all they are worth.

Mapping empathy in the human brain

Empathy appears to arise from specific mechanisms in the brain, and not from a more general property, like general intelligence. It is produced by a sequence of mental events, beginning with "mirror neurons" that fire in tandem with the observed behavior of another person, thereby generating a mental model of this behavior. Copies of the model are sent elsewhere in the brain to decode the nature and purpose of the behavior and to predict the sensory consequences for the observed person. Affective empathy goes further by feeding these predicted consequences into the observer's emotional state (Carr et al., 2003).

Recent MRI research has confirmed that empathy is associated with increased development of certain regions within the brain. Individuals who score high on cognitive empathy have denser gray matter in the midcingulate cortex and the adjacent dorsomedial prefrontal cortex, whereas individuals who score high on affective empathy have denser gray matter in the insula cortex (Eres et al., 2015). A high capacity for affective empathy is also associated with a larger amygdala, which seems to control the way we respond to facial expressions of fear and other signs of emotional distress (Marsh et al., 2014).

Can these brain regions be used to measure our capacity for affective empathy? Two studies, one American and one English, have found that "conservatives" tend to have a larger right amygdala (Kanai et al., 2011; Schreiber et al., 2013). This has been spun, perhaps predictably, as proof that the political right is fear-driven (Hibbing et al., 2014). A likelier explanation is that "conservatives" are disproportionately drawn from populations that have, on average, a higher capacity for affective empathy. 

Do human populations vary in their capacity for affective empathy?

Is it possible, then, that this capacity varies among human populations, just as it varies among individuals? I have argued that affective empathy is more adaptive in larger, more complex societies where kinship obligations can no longer restrain behavior that seriously interferes with the ability of individuals to live together peacefully and constructively (Frost, 2015). Whereas affective empathy was originally expressed mainly between a mother and her children, it has become progressively extended in some populations to a wider range of interactions. This evolutionary change may be compared to the capacity to digest milk sugar: initially, this capacity was limited to early childhood, but in dairy cattle cultures it has become extended into adulthood.

I have also argued that this evolutionary change has gone the farthest in Europeans north and west of the Hajnal Line (Frost, 2014a). In these populations, kinship has been a weaker force in organizing social relations, at least since the early Middle Ages and perhaps since prehistoric times. There has thus been selection for mechanisms, like affective empathy, that can regulate social interaction between unrelated individuals. This selection may have intensified during two time periods:

- An initial period corresponding to the emergence of complex hunter/fisher/gatherers during the Mesolithic along the shores of the North Sea and the Baltic. Unlike other hunter-gatherers, who were typically small bands of individuals, these people were able to form large coastal communities by exploiting abundant marine resources. Such communities were beset, however, by the problem of enforcing rule compliance on unrelated people, the result being strong selection for rule-compliant individuals who share certain predispositions, namely affective empathy, proneness to guilt, and willingness to obey moral rules and to expel anyone who does not (Frost, 2013a; Frost, 2013b).

- A second period corresponding to the spread of Christianity among Northwest Europeans, particularly with the outbreeding, population growth, and increase in manorialism that followed the Dark Ages (hbd chick, 2014). The result was a "fruitful encounter" between the two: on the one hand, Christianity, with its emphasis on internalized morality, struck a responsive chord in these populations; on the other hand, the latter modified Christianity, increasing its emphasis on faith, compassion, and original sin (Frost, 2014b).

Conclusion 

Recent research has brought much insight into the nature of empathy, which should no longer be viewed as being simply a noble precept. We now understand it as the outcome of a sequence of events in specific regions of the brain. We have also learned that individuals vary in their capacity for empathy and that most of this variability is heritable, as is the case with most mental traits. Moreover, empathy has two components—cognitive and affective—and the strength of one in relation to the other likewise varies. Although we often consider affective empathy to be desirable, it can have perverse and even pathological effects in some contexts.


References

Carr, L., M. Iacoboni, M-C. Dubeau, J.C. Mazziotta, and G.L. Lenzi. (2003). Neural mechanisms of empathy in humans: A relay from neural systems for imitation to limbic areas, Proceedings of the National Academy of Sciences (USA), 100, 5497-5502.
http://www.ucp.pt/site/resources/documents/ICS/GNC/ArtigosGNC/AlexandreCastroCaldas/7_CaIaDuMaLe03.pdf 

Chakrabarti, B. and S. Baron-Cohen. (2013). Understanding the genetics of empathy and the autistic spectrum, in S. Baron-Cohen, H. Tager-Flusberg, M. Lombardo. (eds). Understanding Other Minds: Perspectives from Developmental Social Neuroscience, Oxford: Oxford University Press.
http://books.google.ca/books?hl=fr&lr=&id=eTdLAAAAQBAJ&oi=fnd&pg=PA326&ots=fHpygaxaMQ&sig=_sJsVgdoe0hc-fFbzaW3GMEslZU#v=onepage&q&f=false 

Decety, J. and J. Cowell. (2014). The complex relation between morality and empathy, Trends in Cognitive Sciences, 18, 337-339
http://spihub.org/site/resource_files/publications/spi_wp_135_decety.pdf 

Eres, R., J. Decety, W.R. Louis, and P. Molenberghs. (2015). Individual differences in local gray matter density are associated with differences in affective and cognitive empathy, NeuroImage, 117, 305-310.
http://www.sciencedirect.com/science/article/pii/S1053811915004206 

Frost, P. (2013a). The origins of Northwest European guilt culture, Evo and Proud, December 7
http://evoandproud.blogspot.ca/2013/12/the-origins-of-northwest-european-guilt.html 

Frost, P. (2013b). Origins of Northwest European guilt culture, Part II, Evo and Proud, December 14
http://evoandproud.blogspot.ca/2013/12/origins-of-northwest-european-guilt.html 

Frost, P. (2014a). Compliance with Moral Norms: a Partly Heritable Trait? Evo and Proud, April 12
http://evoandproud.blogspot.ca/2014/04/compliance-with-moral-norms-partly.html

Frost, P. (2014b). A fruitful encounter, Evo and Proud, September 26
http://evoandproud.blogspot.ca/2014/09/a-fruitful-encounter.html 

Frost, P. (2015). Two paths, The Unz Review, January 24
http://www.unz.com/pfrost/two-paths/ 

hbd chick (2014). Medieval manorialism’s selection pressures, hbd chick, November 19
https://hbdchick.wordpress.com/2014/11/19/medieval-manorialisms-selection-pressures/ 

Hibbing, J.R., K.B. Smith, and J.R. Alford. (2014). Differences in negativity bias underlie variations in political ideology, Behavioral and Brain Sciences, 37, 297-350
http://www.geoffreywetherell.com/Hibbing%20et%20al%20paper%20and%20commentaries%20(1).pdf 

Kanai, R., T. Feilden, C. Firth, and G. Rees. (2011). Political orientations are correlated with brain structure in young adults, Current Biology, 21, 677 - 680.
http://www.cell.com/current-biology/abstract/S0960-9822(11)00289-2 

Keysers, C. and V. Gazzola. (2014). Dissociating the ability and propensity for empathy, Trends in Cognitive Sciences, 18, 163-166.
http://www.cell.com/trends/cognitive-sciences/pdf/S1364-6613(13)00296-9.pdf 

Marsh, A.A., S.A. Stoycos, K.M. Brethel-Haurwitz, P. Robinson, J.W. VanMeter, and E.M. Cardinale. (2014). Neural and cognitive characteristics of extraordinary altruists, Proceedings of the National Academy of Sciences, 111, 15036-15041.
http://www.pnas.org/content/111/42/15036.short 

Schreiber, D., Fonzo, G., Simmons, A.N., Dawes, C.T., Flagan, T., et al. (2013). Red Brain, Blue Brain: Evaluative Processes Differ in Democrats and Republicans. PLoS ONE 8(2): e52970.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0052970 

Gender reassignment of children. Does it really help?


"Flower boy" (on the right) - In 70-80% of cases, gender confusion will clear up on its own (Wikicommons: Recoplado).

 

I remember feeling some attraction to girls in Grade 2, but it really wasn't until Grade 8 that everything fell into place. I'm talking about puberty. Before high school, I was a boy and not a young man.

I didn't consider myself abnormal. Yes, many boys in Grade 8 had deeper voices, as well as signs of facial hair, but just as many did not, and a few would not have been "sexually functional." As for the earlier grades, certainly before Grade 7, most of us could have passed for little girls—just change the clothing, the hairstyle, and voilà!

Today, puberty is starting earlier. Ontario schools will begin explaining it in ... Grade 4. This falling age is largely due to the changing ethnic and racial origins of the student population, as well as things like overeating (in the case of girls) and perhaps our more sexualized culture.

Nonetheless, a lot of boys remain pre-pubertal throughout most of primary school, and some may have trouble coming to terms with their male identity. They experience what is called “gender confusion.” This is hardly surprising. Testosterone levels are low before puberty, and some boys, especially the ones who have been less androgenized in the womb, may genuinely feel like a girl. I also suspect that modern culture makes things worse by creating expectations that even adult males have trouble meeting. Go to any fitness center and you'll see plenty of young men trying to bring their bodies into line with the "rippled look."

Gender confusion, known medically as gender identity disorder, affects children of both sexes but boys much more so, at least in North America. One clinic reported a ratio of 6.6 boys for each girl, the sex imbalance being attributed partly to greater intolerance of feminine behavior in boys (Zucker et al., 1997). This disorder seems to be partly heritable, although we face a similar problem of perspective here as with the referral statistics (Heylens et al., 2012). To what degree does the heritable component reside in how these children objectively behave, and not in whether their behavior alarms another person, usually a parent? In practice, it’s the latter: it’s whatever behavior makes a parent bring the child to a clinician’s office.

Gender reassignment

We now come to the issue of medical treatment, specifically "gender reassignment." This treatment has recently been condemned by Dr. Paul McHugh, the former psychiatrist-in-chief for Johns Hopkins Hospital:

Then there is the subgroup of very young, often prepubescent children who notice distinct sex roles in the culture and, exploring how they fit in, begin imitating the opposite sex. Misguided doctors at medical centers including Boston's Children's Hospital have begun trying to treat this behavior by administering puberty-delaying hormones to render later sex-change surgeries less onerous—even though the drugs stunt the children's growth and risk causing sterility. (McHugh, 2015)

Is treatment really necessary? McHugh points out: "When children who reported transgender feelings were tracked without medical or surgical treatment at both Vanderbilt University and London's Portman Clinic, 70%-80% of them spontaneously lost those feelings."

McHugh has been accused by the transgender community of misrepresenting the facts:

McHugh also mischaracterizes the treatment of gender nonconforming children. As McHugh states, most gender nonconforming children do not identify as transgender in adulthood.  However, those who receive puberty blocking drugs do not do so until puberty, when trans identity is likely to persist. These drugs allow adolescents and their parents to work with doctors to achieve the best outcome. This approach was demonstrated to be successful in research in the Netherlands before being adopted widely in the U.S. (WPATH, 2015) 

The above text is disingenuous in two ways. First, puberty-blocking drugs are not administered until puberty for an obvious reason: they would be ineffective earlier. The decision to use them, however, is made earlier, often much earlier. Second, these drugs keep hormonal levels from rising, thus maintaining the boy or girl in the same hormonal state and possibly in the same state of gender confusion. Logically, one should wait a few years to see what effect puberty might have.

Is the use of these drugs legitimate? We’re talking about a radical intervention in the normal process of maturation, and this intervention begins before the age of consent, i.e., 16 years of age in most Western countries. Moreover, the eventual gender reassignment will never be complete. Although it’s possible to turn a male into a semblance of a female, such a “female” can never bear children. This isn’t a minor point, given that many male transsexuals wish to maintain a male heterosexual orientation, even to the point of marrying and becoming fathers.

For all these reasons, use of these drugs should be delayed until adulthood, when consent becomes morally defensible, when the risks of sterility are lower, and when any transitory gender confusion will have had time to resolve on its own.

A boy is not a little man

The transgender community talks a good game about "gender fluidity." Ironically, such fluidity is reduced by gender reassignment, which imposes a relatively unchanging adult dichotomy on pre-pubertal individuals who are going through rapid physical and psychological change. This brings us to a second irony. The transgender community complains about how it was once medically pathologized. Yet here it is pathologizing cases of gender confusion that are not unusual among young children and that are consistent with normal child development.

We should remember that both sexes begin with a body plan that is more female than male. This plan is modified at two points of the life cycle: first, in the womb, when the body’s tissues are primed by a surge of androgens or estrogens; and then at puberty, when boys and girls diverge in the levels of their circulating sex hormones, which in turn trigger profound changes in growth and development.

This truth was known to our ancestors. As late as the early 20th century, people accepted that little boys are more akin to little girls than to grown men. This was why both sexes would be dressed in female clothing until school age, and a mother would often boast that her little boy was as pretty as a girl.

[…] infants and small children had for hundreds of years been dressed alike, in frocks, so that family portraits from previous centuries made it difficult to tell the young boys from the girls. “Breeching,” as a rite of passage, was a sartorial definition of maleness and incipient adulthood, as, in later periods, was the all-important move from short pants to long. Gender differentiation grew increasingly desirable to parents as time went on. By the closing years of the twentieth century the sight of little boys in frilly dresses has become unusual and somewhat risible; a childhood photograph of macho author Ernest Hemingway, aged almost two, in a white dress and large hat festooned with flowers, was itself the focus of much amused critical commentary when reproduced in a best-selling biography—especially when it was disclosed that Hemingway’s mother had labelled the photograph of her son “summer girl.”  (Garber, 1997, pp. 1-2)

Hemingway hated those baby pictures, as well as the stories about how his mother would call him “Ernestine” and tell strangers that he and his sister were twin girls. During her declining years, he threatened to cut off his financial support if she ever gave an interview about his childhood (Onion, 2013; Winer, 2008). He saw her as the typical Victorian mother who sought to momify and symbolically castrate her male offspring. With other writers of his time, particularly psychologists and advice columnists, he helped bring about a reform of sexual conventions that, among other things, would sweep away the custom of cross-dressing little boys.

(see here for an early childhood photo of Hemingway and here for similar photos of H.P. Lovecraft)

I remember how I felt seeing such photos when doing research on my family tree. What the?? Today, I feel differently: this cross-dressing strikes me as being healthy, even beautiful in its own way. It avoids the problem of imposing male identity too early in life and thereby forcing slower-developing boys to choose between the identity imposed by society and the one generated by their own mental state—which may still be insufficiently male. It is this situation, and the resulting gender confusion, that is now putting many boys at risk of gender reassignment. Yet there’s nothing wrong with most of them. They just need more time to grow up.

As an extreme example, let’s take the case of "pseudohermaphrodites"—males who look female at birth because their penis resembles a clitoris and because their testes remain inside the body. They are typically raised as girls until puberty, at which time the penis grows in size, the testes descend into the scrotum, and they become like men physically and psychologically. When 18 pseudohermaphrodites were studied in the Dominican Republic, it was found that 16 of them had made the transition from girlhood to manhood with no evidence of psychosexual maladjustment (Imperato-Mcginley et al., 1979). A similar situation often arose among Canada’s Inuit whenever a newborn received the name of a deceased relative. If the child was a boy and the relative a woman, it would be raised as a girl until puberty and as a man thereafter. Such individuals became not only husbands and fathers but also respected shamans (Saladin d'Anglure, 2005).

In short, gender confusion in childhood poses no threat to normal child development. Indeed, whether we acknowledge it or not, all boys start off being more like little girls than the men they will become. This “early girlhood” may actually play a key role in their psychosexual development, and our ancestors might have had good reasons to believe that boyhood begins later. But that raises a troubling question: by trying to masculinize this early phase of life, have we opened the door to unknown consequences?

So if you have a young boy who’s confused about his gender identity, the chances are very good that he’ll successfully transition to manhood ... as long as he’s not given puberty-blocking drugs. This is not a medical condition that needs treatment.

References 

Garber, M.B. (1997). Vested Interests: Cross-Dressing and Cultural Anxiety, Psychology Press.
https://books.google.ca/books?id=eeASHasS0oUC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Heylens, G., G. De Cuypere, K.J. Zucker, C. Schelfaut, E. Elaut, H. Vanden Bossche, E. De Baere, and G. T’Sjoen. (2012). Gender identity disorder in twins: A review of the case report literature, The Journal of Sexual Medicine, 9, 751-757.
http://onlinelibrary.wiley.com/doi/10.1111/j.1743-6109.2011.02567.x/abstract 

Imperato-Mcginley, J., R.E. Petersen, T. Gautier, and E. Sturia. (1979). Male pseudohermaphroditism secondary to 5a-reductase deficiency—A model for the role of androgens in both the development of the male phenotype and the evolution of a male gender identity, Journal of Steroid Biochemistry, 11, 637-645.
http://www.sciencedirect.com/science/article/pii/0022473179900931 

McHugh, P. (2015). Transgender surgery isn't the solution, The Wall Street Journal, June 12
http://www.wsj.com/articles/paul-mchugh-transgender-surgery-isnt-the-solution-1402615120 

Onion, R. (2013). Pages from Hemingway’s baby books, Slate, July 23
http://www.slate.com/blogs/the_vault/2013/07/23/hemingway_scrapbooks_grace_hemingway_s_records_of_son_ernest_hemingway_s.html 

Saladin d'Anglure, B. (2005). The 'Third Gender' of the Inuit, Diogenes, 52, 134-144.
http://dio.sagepub.com/content/52/4/134.short 

Winer, A. (2008). Why Hemingway used to wear women’s clothing, Mental_floss, December 18
http://mentalfloss.com/article/20396/why-hemingway-used-wear-womens-clothing 

WPATH (2015). Wall Street Journal Editorial Critiques Transgender Health July 2, 2014
http://www.wpath.org/site_page.cfm?pk_association_webpage_menu=1635&pk_association_webpage=4905 

Zucker, K.J., S.J. Bradley, and M. Sanikhani. (1997). Sex differences in referral rates of children with gender identity disorder: some hypotheses, Journal of Abnormal Child Psychology, 25, 217-227.
http://link.springer.com/article/10.1023/A:1025748032640#page-1

Young, male, and single


 
The Babylonian Marriage Market, by Edwin Long (1829-1891). There are too many young men on the mate market, particularly in the White American community.

 

It sucks being young, male, and single. Don't think so? Go to the Interactive Singles Map of the United States and see how it looks for the 20 to 39 age group. Almost everywhere single men outnumber single women.

And the real picture is worse. For one thing, the imbalance is greater among singles without children. This is not a trivial factor, since single mothers are "single" only in the sense of being available for sexual relations. They are still raising offspring from a previous relationship and many are not interested in having more children.

Then there's polygamy—or "polyamory," to use the preferred term—where a minority of men controls sexual access to a larger number of women. If we compare the 1940-1949 and 1970-1979 cohorts of American adults, we find an increase in the median number of lifetime partners from 2.6 to 5.3 among women and from 6.7 to 8.8 among men (Liu et al., 2015). Because partner counts are spread much more unevenly among men than among women, a young woman is more likely than a young man to be sexually active at all (a toy example below makes this concrete). This is crudely seen in infection rates for chlamydia—the most common sexually transmitted disease. Hispanic Americans still show the traditional pattern of greater sexual activity among men than among women, the rates being 7.24% of men and 4.42% of women. White Americans display the reverse: 1.38% of men and 2.52% of women (Miller et al., 2004).
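To make that step concrete, here is a minimal sketch in Python with invented numbers (none of them come from Liu et al. or any other survey). It gives 100 women and 100 men the same total number of heterosexual pairings, spreads those pairings evenly among the women but unevenly among the men, and then counts who is left with no partner at all.

    # Illustrative only: the counts below are invented to show the logic,
    # not taken from Liu et al. (2015) or any other dataset.

    women = [2] * 100                      # 100 women, 2 partners each -> 200 pairings
    men = [8] * 20 + [1] * 40 + [0] * 40   # 100 men, same 200 pairings, skewed toward a minority

    assert sum(women) == sum(men)          # every heterosexual pairing has one partner on each side

    def share_active(partner_counts):
        """Fraction of people with at least one partner."""
        return sum(1 for n in partner_counts if n > 0) / len(partner_counts)

    print(share_active(women))  # 1.0 -> every woman is sexually active
    print(share_active(men))    # 0.6 -> 40% of the men are left out entirely

The totals match on both sides; the only difference is how unevenly the pairings are distributed among the men, and that alone is enough to leave a large minority of them with no partner.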

Finally, there’s a racial angle. This sex ratio is more skewed among White Americans than among African Americans, mainly because the latter have a lower sex ratio at birth and a higher death rate among young men.

It's hard to avoid concluding that a lot of young white men are shut out of the marriage market ... or any kind of heterosexual relationship. This wife shortage was once thought to be temporary, being due to baby-boomer men getting divorced and marrying younger women from the smaller "baby bust" cohort. With time, they would get too old to compete with young men, and the problem would resolve itself.

Today, the crest of the baby boom is entering the seventh decade of life, yet the update to the Interactive Singles Map shows no change to the gender imbalance. So what gives? It appears that demographers have focused too much on the baby-boomer effect and not enough on other factors that matter just as much and, more importantly, show no signs of going away. These factors can be summarized as follows.

Re-entry of older men into the mate market

We have a mate market where 20- to 50-year-old men are competing for 20- to 40-year-old women. That in itself is nothing new. But something else is.

The baby boom eclipsed an equally important but longer-term trend: more and more men are living past the age of 40. With or without the baby boom, we’ll still see large numbers of older men getting divorced and marrying younger women. The cause isn’t just liberal divorce laws. It’s also the fact that we have far more older guys out there as a proportion of the population.

Sure, we will also see younger men pairing up with "cougars" but there are limits to that option, as noted in a New Zealand study:

The male partner may want to partner up with someone younger or have children, which may not be possible with an older woman (for physical reasons or because she chooses not to have (more) children). The younger male partner may not want to become a step-father to existing children. Research has shown that childbearing can be the ultimate deal breaker in this kind of relationship. (Lawton and Callister, 2010)

Persistence of the imbalanced sex ratio at birth

About 105 males are born for every 100 females among people of European origin. This sex ratio used to decline to parity during childhood because of higher infant mortality among boys. It then declined even further in early adulthood because of war, industrial accidents, and other hazards. This isn't the distant past. If you talk with women who came of age in the postwar era, they will tell you about their fears of remaining single past the age of thirty. At that age, very few single men were left to go around.

Well, things have changed. The skewed sex ratio at birth is now persisting well into adulthood, thanks to modern medicine and the relative peace that has prevailed since 1945. Women begin to outnumber men only in the 35-39 age group in the United States and in the 40-44 age group in the United Kingdom.

Equalization of male and female same-sex preference

Historically, same-sex preference was more common among men than among women. This gender gap appears to be closing, according to a recent study:

The percent distributions were quite similar for men and women; however, a higher percentage of men identified as gay (1.8%) compared with women who identified as gay/lesbian (1.4%), and a higher percentage of women identified as bisexual (0.9%) compared with men (0.4%). (CDCP, 2014, p. 5) 

Disparities in outmarriage

At present, there are more White American women outmarrying than White American men, particularly in younger age groups. This disparity is mainly in marriages with African American men, there being no gender difference in marriages with Hispanic Americans and the reverse gender difference in marriages with Asian Americans (Jacobs and Labov, 2002; Passel et al., 2010). Overall, this factor further skews the ratio of young single men to young single women in the White American community. 

This disparity isn't new. What is new is its extent, for both legal and common-law marriages. An idea may be gleaned from statistics on children born to White American women, specifically the proportion fathered by a non-White partner. For the U.S. as a whole the proportion in 2013 was between 11% and 20% (the uncertainty is due to 190,000 births for which the father's race was not stated). By comparison, the proportion in 1990 was between 5% and 13% (Centers for Disease Control and Prevention, 2013; see also Silviosilver, 2015).
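For readers wondering how a single year can yield a range rather than a single number, here is a minimal sketch of the bounding logic. Only the 190,000 unstated-father births come from the figures above; the other two counts are illustrative assumptions, chosen simply to reproduce the quoted 11%-20% range.

    # Bounding a proportion when part of the data is missing.
    # Lower bound: treat every unstated father as White.
    # Upper bound: treat every unstated father as non-White.

    def paternity_bounds(nonwhite_father, unstated_father, total_births):
        lower = nonwhite_father / total_births
        upper = (nonwhite_father + unstated_father) / total_births
        return lower, upper

    UNSTATED_FATHER = 190_000   # from the 2013 figure cited above
    NONWHITE_FATHER = 230_000   # illustrative assumption, not a CDC number
    TOTAL_BIRTHS = 2_100_000    # illustrative assumption, not a CDC number

    low, high = paternity_bounds(NONWHITE_FATHER, UNSTATED_FATHER, TOTAL_BIRTHS)
    print(f"between {low:.0%} and {high:.0%}")   # -> between 11% and 20%

The same bracketing applies to the 1990 figures; the width of the range is simply the share of births for which the father's race was not stated.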

Whenever this issue comes up for discussion, there are often reassurances that the disparity will disappear in a post-racial world that has been cleansed of "White privilege." I'm not so sure. The European female phenotype seems to be very popular, and this was so even when white folks were geopolitical weaklings. Today, the term “white slavery” is merely a synonym for prostitution, but it originally meant the enslavement of fair-skinned women for sale to clients in North Africa, the Middle East, and South Asia.  At the height of this trade, between 1500 and 1650, over 10,000 Eastern Europeans were enslaved each year for export (Kolodziejczyk, 2006; Skirda, 2010). The overwhelming majority were young women and pre-pubertal boys who were valued for their physical appearance. And yet they were powerless.

No, I don't think this kind of preference will disappear as whites lose "privilege."

Exit strategies

So more and more young men are being left on the shelf, particularly in White America. How do they cope? Mostly by turning to porn from Internet websites, videocassettes, or magazines. Love dolls are another option and may grow in popularity as they become more human-like, not only physically but also in their ability to talk and interact.

Another option is outmarriage. In the past, this trend largely concerned older men marrying East Asian or Hispanic women, but we’re now seeing plenty of young men outmarrying via Internet dating sites. Despite the local supply of single women in the African American community, these young men show a much stronger tendency to look abroad, generally to women in Eastern Europe, South America, or East Asia.

Then there's gender reassignment, which means either entering the other side of the mate market or tapping into the lesbian market. It’s a viable strategy, all the more so because many white boys can be turned into hot trans women. I'm not saying that some young men actually think along those lines, but gender reassignment is functioning that way.

Finally, there's "game." My attitude toward game is like my attitude toward gender reassignment. Both are attempts to push the envelope of phenotypic plasticity beyond its usual limits, and neither can fully achieve the desired result. A lot of boys aren't wired for game, and there are good reasons why, just as there are good reasons why some people are born male. Male shyness isn't a pathology. It's an adaptation to a social environment that values monogamy and high paternal investment while stigmatizing sexual adventurism. Our war on male shyness reflects our perverse desire to create a society of Don Juans and single mothers.

But if game works, why not? Whatever floats your boat.

Conclusion

Ideally, this gender imbalance should be dealt with at the societal level, but I see little chance of that happening in the near future. If anything, public policy decisions will probably worsen the current imbalance. Changes to public policy generally result from a long process that begins when people speak up and articulate their concerns, yet it's unlikely that even this first step will be taken any time soon. Young single men prefer to remain silent and invent nonexistent girlfriends. They also tend to be marginal in the main areas of discourse creation, like print and online journalism, TV, film, and radio production, book writing, etc. Leaf through any magazine, and you'll probably see more stuff about the problems of single women.

So this imbalance will likely continue to be addressed at the individual level through individual strategies.

References 

Centers for Disease Control and Prevention. (2014). Sexual Orientation in the 2013 National Health Interview Survey: A Quality Assessment, Vital and Health Statistics, 2(169), December
http://www.cdc.gov/nchs/data/series/sr_02/sr02_169.pdf 

Centers for Disease Control and Prevention. (2013). Vital Statistics Online
http://www.cdc.gov/nchs/data_access/Vitalstatsonline.htm (for discussion, see Silviosilver, 2015 http://www.unz.com/pfrost/the-last-push-back-against-liberalism/#comment-896920) 

Jacobs, J.A. and T.B. Labov. (2002). Gender differentials in intermarriage among sixteen race and ethnic groups, Sociological Forum, 17, 621-646.
http://link.springer.com/article/10.1023/A:1021029507937 

Kolodziejczyk, D. (2006). Slave hunting and slave redemption as a business enterprise: The northern Black Sea region in the sixteenth to seventeenth centuries, Oriente Moderno, 86, 1, The Ottomans and Trade, pp. 149-159.
http://www.jstor.org/discover/10.2307/25818051?sid=21105312761261&uid=3737720&uid=3739448&uid=2&uid=4 

Lawton, Z. and P. Callister. (2010). Older Women-Younger Men Relationships: the Social Phenomenon of 'Cougars'. A Research Note, Institute of Policy Studies Working Paper 10/02
http://ips.ac.nz/publications/files/be0acfcb7d0.pdf 

Liu, G., S. Hariri, H. Bradley, S.L. Gottlieb, J.S. Leichliter, and L.E. Markowitz. (2015). Trends and patterns of sexual behaviors among adolescents and adults aged 14 to 59 years, United States, Sexually Transmitted Diseases, 42, 20-26.
http://journals.lww.com/stdjournal/Abstract/2015/01000/Trends_and_Patterns_of_Sexual_Behaviors_Among.6.aspx 

Miller, W.C., C.A. Ford, M. Morris, M.S. Handcock, J.L. Schmitz, M.M. Hobbs, M.S. Cohen, K.M. Harris, and J.R. Udry. (2004). Prevalence of chlamydial and gonococcal infections among young adults in the United States, JAMA, 291, 2229-2236.
http://jama.jamanetwork.com/article.aspx?articleid=198722

Passel, J.S., W. Wang, and P. Taylor. (2010). One-in-seven new U.S. marriages is interracial or interethnic, Pew Research Center, Social & Demographic Trends,
http://www.pewsocialtrends.org/2010/06/04/ii-overview-2/

Skirda, A. (2010). La traite des Slaves. L'esclavage des Blancs du VIIIe au XVIIIe siècle, Paris, Les Éditions de Paris Max Chaleil. 

Soma, J. (2013). Interactive Singles Map
http://jonathansoma.com/singles/
 

The Jews of West Africa?


Bronze vessel in the form of a snail shell, 9th century, Igbo-Ukwu (Wikicommons). The Igbo developed metallurgy much earlier than the rest of West Africa.

 

There has been much talk here about Chanda Chisala's article "The IQ gap is no longer a black and white issue." Much of the article focuses on the Igbo (known also as Ibo), a people who live in the Niger Delta and "are well known to be high academic achievers within Nigeria." In the United Kingdom, their children do as well in school as Chinese and Indian students:

The superior Igbo achievement on GCSEs is not new and has been noted in studies that came before the recent media discovery of African performance. A 2007 report on "case study" model schools in Lambeth also included a rare disclosure of specified Igbo performance (recorded as Ibo in the table below) and it confirms that Igbos have been performing exceptionally well for a long time (5 + A*-C GCSEs); in fact, it is difficult to find a time when they ever performed below British whites. (Chisala, 2015)

The Igbo have long been known as achievers, particularly in business. Whereas trade is largely women's work in the rest of West Africa, it is dominated by Igbo of both sexes in Nigeria.

[...] In study after study, it has been documented that the Ibo, through conflict and mobility, have been very successful in enterprise. Indeed, a major study argued that the Ibo have a very high need for achievement in the business world. Still another study showed that the majority of entrepreneurs in the sample were Ibo. (Butler, 1997, p. 178)

Sabino and Hall (1999) describe them as being “competitive, individualistic, status-conscious, antiauthoritarian, pragmatic, and practical—a people with a strongly developed commercial sense.” In colonial-era literature, they were often called "the Jews of West Africa" (see note).

Prehistory

How did the Igbo become so entrepreneurial? It's possible that their location in the Niger Delta predisposed them to be go-betweens in trade between coastal and interior peoples. Similar assemblages of glass beads, many of Egyptian origin and dating to the 9th and 14th centuries, have been recovered from the Niger Delta and eastern Mali, indicating that the Niger acted as a conduit of trade from the Atlantic coast to the Sahel and thence to the Middle East (Davison, 1972; Insoll and Shaw, 1997).

Archaeological sites in the Niger Delta show that advanced economic development began much earlier there than elsewhere in West Africa. This is seen in early use of metallurgy. At one metallurgical complex, dated to 765 BC, iron ore was smelted in furnaces measuring a meter wide. The molten slag was drained through conduits to pits, where it formed blocks weighing up to 43-47 kg. The operating temperatures are estimated to have varied between 1,155 and 1,450 degrees C (Holl, 2009). Some radiocarbon dates for iron smelting in this region go back to 2000 BC (Eze-Uzomaka, 2009).

This production seems to have been in excess of local needs and therefore driven by trade with other peoples:

One aspect which can be inferred from the cylindrical slag blocks left behind is that the Lejja smelters must have had excess production of iron, and this may have led to extensive trade to far and distant places, sustained over a long period of time. (Eze-Uzomaka, 2009)

This metallurgy is unusual not only in its early date for West Africa but also in its subsequent development, which reached a high level of sophistication despite a lack of borrowing from metallurgical traditions in the Middle East and Europe. This may be seen in more than 700 artefacts of bronze, copper, and iron recovered from the Igbo-Ukwu site and dated to the 9th century AD:

They are the oldest bronze artifacts known in West African and were manufactured centuries before the emergence of other known bronze producing centers such as those of Ife and Benin. The bronzes include numerous ritual vessels, pendants, crowns, breastplates, staff ornaments, swords, and fly-whisk handles.

The Igbo-Ukwu bronzes amazed the world with a very high level of technical and artistic proficiency and sophistication which was at this time distinctly more advanced than bronze casting in Europe.

[...] Apparently the metal workers of ancient Igbo-Ukwu were not aware of commonly used techniques such as wire making, soldering or riveting which suggests an independent development and long isolation of their metal working tradition.

[...] Some of the techniques used by the ancient smiths are not known to have been used outside Igbo-Ukwu such as the production of complex objects in stages with the different parts later fixed together by brazing or by casting linking sections to join them. (Wikipedia, 2015) 

Contact with European traders

Thus, even before the first European contacts in the 16th century, the Igbo were already the focus of a network of trading relationships that extended outward from the Niger Delta. European traders became integrated into this trade network, thereby enabling the Igbo to emerge as valued middlemen in the slave trade: 

The peoples of south-eastern Nigeria have been involved in trade for as long as there are any records. The archaeological sites at Igbo-Ukwu and other evidence reveal long distance trade in metal and beads, as well as regional trade in salt, cloth, and beads at an early date. The lower Niger River and its Delta featured prominently in this early trade, and evidence is offered to suggest a continuity in the basic modes of trade on the lower Niger from c. A.D. 1500 to the mid-nineteenth century. An attempt to sketch the basic economic institutions of the Igbo hinterland before the height of the slave trade stresses regional trading networks in salt, cloth, and metal, the use of currencies, and a nexus of religious and economic institutions and persons. It is argued that while the growth of the slave trade appears to have been handled without major changes in the overall patterns of trade along the lower Niger, in the Igbo hinterland a new marketing 'grid', dominated by the Arochuku traders, was created using the pre-existent regional trading networks and religious values as a base. (Northrop, 1972)

British colonial rule

Great Britain took over Nigeria initially as part of its effort to outlaw the slave trade. Lagos was annexed in 1861 and a sphere of influence over the country was recognized in 1885 at the Berlin Conference, although a protectorate would not be proclaimed until 1901.

This new political environment favored the Igbo, whose initiative, self-discipline, and future orientation predisposed them to succeed not only in their homeland but also elsewhere in Nigeria, where they soon became dominant as merchants and civil servants. They thus took on a role like that of middleman minorities elsewhere in the empire, such as the Parsis in western India, the Chinese in Malaya, and the South Asians in East Africa. By the 1930s, one Igbo boasted that "the Ibo domination of Nigeria is a matter of time" (Ibrahim, 2000, p. 56). This trend even affected the army. By independence, 24 of the 52 senior army officers of the rank of major and above were Igbos (Ibrahim, 2000, p. 55).

This dominance led to jealousy among Nigerians in the north and west, who accused the Igbo of unfair business practices:

In the private sector they [the Hausa Muslims] are open to the exploitation of the Ibo control of the modern sector of private business activities. Ibos fix prices unilaterally by which Hausa money is siphoned daily. The Hausa are reduced to utter poverty and a large percentage of them rendered street beggars. (quoted in Ibrahim, 2000, p. 52)

According to Arthur Nwankwo (1985:9) "Nigerians of all other ethnic groups will probably achieve consensus on no other matter than their common resentment of the Igbo", a phenomenon that Chinua Achebe had dubbed "the Igbo problem". They argue that the Igbos are more cosmopolitan, more adapted to other cultures, more individualistic and competitive, more receptive to change and more prone to settle and work in other parts of the country but the myth persists that they are aggressive, arrogant and clannish. (Ibrahim, 2000, p. 55)

Independence, civil war, and the aftermath

Independence came to Nigeria in 1960, and with it growing disillusionment among many Igbo, particularly with the perceived instability and corruption of the political process. In 1966, Igbo officers staged a coup and seized control of the country, killing the prime minister and the premiers of the northern and western regions. Northern army officers then staged a countercoup, and Igbo began to flee northern cities in the wake of persecution.

The next year, in 1967, the Igbo seceded and formed their own country, the Republic of Biafra. They lost the ensuing civil war at the cost of a million civilian deaths and a devastated homeland. Nonetheless, they are today building on "the remarkable Igbo economic and commercial élan that has occurred since the end of the civil war" (Ibrahim, 2000, p. 56).

Yet mistrust remains: "the North and the West have a deep-seated mistrust of the Igbo and so are bent on restricting, containing, and denying the Igbo their political right. Added to this is their subtle message to other minority groups: the Igbo, as a group, are not to be trusted!" (Abidde, 2004). This mistrust is founded on a not unjustified perception that the Igbo will prevail on any level playing field:

Collectively, the Igbo are wealthy, educated, and intelligent. These are people with global influence, strength of character, élan and self confidence. The Igbo nation has attributes most other Nigerian nations can only dream of; and are what most other nations are not. The Igbo made and makes Nigeria better. Any wonder then that the Igbo can do without Nigeria; but Nigeria and her myriad nationalities cannot do without the Igbo? Take the Igbo out of the Nigeria equation, and Nigeria will be a wobbling giant gasping for air! (Abidde, 2004)

Today, there is growing recognition in Nigeria that the Igbo can and should be given more political and economic power, but there is still a fear that they will use such power selfishly and not for the good of all Nigerians.

Conclusion

Chanda Chisala uses the Igbo example to refute the "hereditarian-HBD" argument. In doing so, he comes closer to the HBD position than he may realize. Recent work on gene-culture coevolution has shown that the average mental makeup of human populations can change significantly over a short span of historical time. This notably seems to have happened with the Ashkenazi Jews and the English between the Middle Ages and the 19th century (Clark, 2007; Cochran et al., 2006).

Why couldn't a similar process have happened with the Igbo? Why assume that sub-Saharan Africa is a monolith whose diverse populations have evolved in exactly the same way? We know that human genetic evolution didn't slow down with the coming of culture. It actually sped up (Hawks et al., 2007). For the most part, we humans have diversified genetically in response to differences in cultural environment and not to differences in natural environment. It is therefore plausible that the different cultures of Africa have had different effects on the gene pools of their respective populations.
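To see how quickly such gene-culture change can add up, consider a back-of-the-envelope calculation with the standard breeder's equation (R = h²S). The short Python sketch below is purely illustrative: the heritability, selection differential, and generation count are assumptions chosen for the sake of the arithmetic, not estimates for the Igbo or any other real population.

# Hedged illustration: cumulative response to selection under the breeder's
# equation R = h^2 * S, assuming constant heritability and constant selection.
# All parameter values are assumptions chosen only for illustration.

def cumulative_response(h2, s, generations):
    """Total shift in the population mean, in standard deviations,
    after a given number of generations of constant selection."""
    return h2 * s * generations

h2 = 0.4          # assumed narrow-sense heritability of the trait
s = 0.1           # assumed selection differential per generation, in SD units
generations = 25  # roughly the Middle Ages to the 19th century

shift = cumulative_response(h2, s, generations)
print(f"Cumulative shift: {shift:.1f} SD")  # prints 1.0 SD under these assumptions

Under these made-up but modest numbers, the mean shifts by a full standard deviation in well under a millennium, which is the order of magnitude the gene-culture argument requires.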

I can hear the answer to my question: "You guys are the ones who think all blacks are alike!" Well, that isn't what I think.

On a final note, I couldn't help noticing the many commenters who complimented Chanda on sticking it to the HBD crowd. Don't they see what this implies? If it can be shown that some African groups have evolved higher cognitive ability, doesn't it become plausible, even expectable, that other groups have evolved lower?

Note

It may be that a similar sort of nickname had evolved into the word "Igbo" itself: "[...] some Ibo claim that the word 'Hebrew' must have been mutilated to 'Ubru' or 'Ibru,' then to 'Uburu,' and later to 'Ibo'" (Butler, 1997, pp. 177-178). This is plausible, given that the Igbo initially had a weak sense of collective identity and may not have had a native name for themselves, thus inclining them to take a name given by outsiders. There are examples of this sort of thing elsewhere in Africa. The Tukulor of Senegal, for instance, were originally called the "two colors" by European travellers because some of them were light-skinned and others dark-skinned.

References 

Abidde, S.O. (2004). The Nigerian Presidency and the Igbo Nation, Gamji
http://www.gamji.com/article3000/NEWS3755.htm 

Butler, J.S. (1997). Why Booker T. Washington was right. A reconsideration of the economics of race, in T.D. Boston (ed.) A Different Vision: African American economic thought, Volume 1, (pp. 174-193), Psychology Press
https://books.google.ca/books?id=WMwRwp9QImAC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Chisala, C. (2015). The IQ gap is no longer a black and white issue, The Unz Review, June 25
http://www.unz.com/article/the-iq-gap-is-no-longer-a-black-and-white-issue/

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World, Princeton University Press, Princeton and Oxford. 

Cochran, G., J. Hardy, and H. Harpending. (2006). Natural history of Ashkenazi intelligence, Journal of Biosocial Science, 38, 659-693.
http://harpending.humanevo.utah.edu/trial.link/Ashkenazi.pdf

Davison, C.C. (1972). Glass beads in African archaeology: Results of neutron activation analysis, supplemented by results of X-ray fluorescence analysis, Lawrence Berkeley Laboratory, University of California


Eze-Uzomaka, P. (2009). Iron and its influence on the prehistoric site of Lejja, World of Iron Conference
https://www.academia.edu/4103707/Iron_and_its_influence_on_the_prehistoric_site_of_Lejja 

Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution, Proceedings of the National Academy of Sciences U.S.A., 104, 20753-20758.
http://www.researchgate.net/publication/5761823_Recent_acceleration_of_human_adaptive_evolution/file/9c9605240c4bb57b55.pdf 

Holl, A. F.C. (2009). Early West African Metallurgies: New Data and Old Orthodoxy, Journal of World Prehistory, 22, 415-438
http://www.researchgate.net/profile/Augustin_Holl/publication/226180393_Early_West_African_Metallurgies_New_Data_and_Old_Orthodoxy/links/00b7d52d503ee77a4c000000.pdf 

Ibrahim, J. (2000). The transformation of ethno-regional identities in Nigeria, in A. Jegga (ed.) Identity Transformation and Identity Politics Under Structural Adjustment in Nigeria, (pp. 41-61), Nordic Africa Institute.
https://books.google.ca/books?id=fUWLQv8-H70C&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Insoll, T. and T. Shaw. (1997). Gao and Igbo-Ukwu: Beads, interregional trade, and beyond, African Archaeological Review, 14, 9-23
http://link.springer.com/article/10.1007/BF02968364 

Northrup, D. (1972). The growth of trade among the Igbo before 1880, The Journal of African History, 13, 217-236.
http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=3234276&fileId=S0021853700011439

Sabino, R. and J. Hall. (1999). The path not taken: Cultural identity in the interesting life of Olaudah Equiano, MELUS, 24, 5-19. 

Wikipedia. (2015). Archaeology of Igbo-Ukwu
https://en.wikipedia.org/wiki/Archaeology_of_Igbo-Ukwu

Sometimes the consensus is phony


Migrants arriving on the island of Lampedusa (Wikicommons). The NATO-led invasion of Libya has opened a huge breach in Europe's defences.

 

A synthesis has been forming in the field of human biodiversity. It may be summarized as follows: 

1. Human evolution did not end in the Pleistocene or even slow down. In fact, it speeded up with the advent of agriculture 10,000 years ago, when the pace of genetic change rose over a hundred-fold. Humans were no longer adapting to relatively static natural environments but rather to faster-changing cultural environments of their own making. Our ancestors thus directed their own evolution. They created new ways of life, which in turn influenced who would survive and who wouldn't.

2. When life or death depends on your ability to follow a certain way of life, you are necessarily being selected for certain heritable characteristics. Some of these are dietary—an ability to digest milk or certain foods. Others, however, are mental and behavioral, things like aptitudes, personality type, and behavioral predispositions. This is because a way of life involves thinking and behaving in specific ways. Keep in mind, too, that most mental and behavioral traits have moderate to high heritability.

3. This gene-culture co-evolution began when humans had already spread over the whole world, from the equator to the arctic. So it followed trajectories that differed from one geographic population to another. Even when these populations had to adapt to similar ways of life, they may have done so differently, thus opening up (or closing off) different possibilities for further gene-culture co-evolution. Therefore, on theoretical grounds alone, human populations should differ in the genetic adaptations they have acquired. The differences should generally be small and statistical, being noticeable only when one compares large numbers of individuals. Nonetheless, even small differences, when added up over many individuals and many generations, can greatly influence the way a society grows and develops.

4. Humans have thus altered their environment via culture, and this man-made environment has altered humans via natural selection. This is probably the farthest we can go in formulating a unified theory of human biodiversity. For Gregory Clark, the key factor was the rise of settled, pacified societies, where people could get ahead through work and trade, rather than through violence and plunder. For Henry Harpending and Greg Cochran, it was the advent of agriculture and, later, civilization. For J. Philippe Rushton and Ed Miller, it was the entry of humans into cold northern environments, which increased selection for more parental investment, slower life history, and higher cognitive performance. Each of these authors has identified part of the big picture, but the picture itself is too big to reduce to a single factor.

5. Antiracist scholars have argued against the significance of human biodiversity, but their arguments typically reflect a lack of evolutionary thinking. Yes, human populations are open to gene flow and are thus not sharply defined (if they were, they would be species). It doesn't follow, however, that the only legitimate objects of study are sharply defined ones. Few things in this world would pass that test.

Yes, genes vary much more within human populations than between them, but these two kinds of genetic variation are not comparable. A population boundary typically coincides with a geographic or ecological barrier, such as a change from one vegetation zone to another or, in humans, a change from one way of life to another. It thus separates not only different populations but also differing pressures of natural selection. This is why genetic variation within a population differs qualitatively from genetic variation between populations. The first kind persists despite the similar selection pressures acting on everyone within the population, and thus tends to involve genes of little or no selective value. The second kind occurs across population boundaries, which tend to separate different ecosystems, different vegetation zones, different ways of life ... and different selection pressures. So those genes matter a lot more, as the sketch at the end of this list illustrates.

This isn't just theory. We see the same genetic overlap between many sibling species that are nonetheless distinct anatomically and behaviorally. Because such species have arisen over a relatively short span of time, like human populations, they have been made different primarily by natural selection, so the genetic differences between them are more likely to have adaptive, functional consequences ... as opposed to "junk variability" that slowly accumulates over time.
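The variance arithmetic behind the "more variation within than between" claim is easy to reproduce. The Python sketch below uses invented numbers, two equally sized populations with the same within-group spread and means half a standard deviation apart, simply to show that more than 90% of the total variance can lie within populations even when the between-population difference is the part shaped by differing selection pressures.

import numpy as np

# Hedged illustration with made-up parameters: two populations, identical
# within-group spread, and a modest 0.5 SD difference in means.
rng = np.random.default_rng(0)
n = 100_000
pop_a = rng.normal(loc=0.0, scale=1.0, size=n)
pop_b = rng.normal(loc=0.5, scale=1.0, size=n)

combined = np.concatenate([pop_a, pop_b])
grand_mean = combined.mean()

# Partition total variance into within-group and between-group components
# (equal group sizes, so simple averages suffice).
within = 0.5 * (pop_a.var() + pop_b.var())
between = 0.5 * ((pop_a.mean() - grand_mean) ** 2 + (pop_b.mean() - grand_mean) ** 2)

print(f"Between-group share of variance: {between / (within + between):.1%}")  # about 6%
print(f"Within-group share of variance:  {within / (within + between):.1%}")   # about 94%

Under these assumptions the between-group share comes out near 6%, yet that small share is precisely the part that tracks the differing selection pressures discussed above.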

Why is the above so controversial?

The above synthesis should not be controversial. Yet it is. In fact, it scarcely resembles acceptable thinking within academia and even less so within society at large. There are two main reasons.

The war on racism 

In the debate over nature versus nurture, the weight of opinion shifted toward the latter during the 20th century. This shift began during the mid-1910s and was initially a reaction against the extreme claims being made for genetic determinism. In reading the literature of the time, one is struck by the restraint of early proponents of environmental determinism, especially when they argue against race differences in mental makeup. An example appears in The Clash of Colour (1925), whose author condemned America's Jim Crow laws and the hypocrisy of proclaiming the rights of Europeans to self-determination while ignoring those of Africans and Asians. Nonetheless, like the young Franz Boas, he was reluctant to deny the existence of mental differences:

I would submit the principle that, although differences of racial mental qualities are relatively small, so small as to be indistinguishable with certainty in individuals, they are yet of great importance for the life of nations, because they exert throughout many generations a constant bias upon the development of their culture and their institutions. (Mathews, 1925, p. 151)

That was enlightened thinking in the 1920s. The early 1930s brought a radical turn with Hitler's rise to power and a growing sense of urgency that led many Jewish and non-Jewish scholars to declare war on "racism." The word itself was initially a synonym for Nazism, and even today Nazi Germany still holds a central place in antiracist discourse.

Why didn't the war on racism end when the Second World War ended? For one thing, many people feared a third global conflict in which anti-Semitism would play a dominant role. For another, antiracism took on a life of its own during the Cold War, when the two superpowers were vying for influence over the emerging countries of Asia and Africa.

Globalism

The end of the Cold War might have brought an end to the war on racism, or at least a winding down, had socialism not been replaced by an even more radical project: globalism. This is the hallmark of "late capitalism," a stage of historical development when the elites no longer feel restrained by national identity and are thus freer to enrich themselves at their host society's expense, mainly by outsourcing jobs to low-wage countries and by insourcing low-wage labor for jobs that cannot be relocated, such as those in construction and services. That's globalism in a nutshell.

This two-way movement redistributes wealth from owners of labor to owners of capital. Businesses get not only a cheaper workforce but also weaker labor and environmental standards. To stay competitive, workers in high-wage countries have to accept lower pay and a return to working conditions of another age. The top 10% are thus pulling farther and farther ahead of everyone else throughout the developed world. They're getting richer ... not by making a better product but by making the same product with cheaper and less troublesome inputs of labor. This is not a win-win situation, and the potential for revolutionary unrest is high.

To stave off unrest, economic systems require legitimacy, and legitimacy is made possible by ideology: a vision of a better future; how we can get there from here; and why we're not getting there despite our best efforts. Economic systems don't create ideology, but they do create conditions that favor some ideologies over others. With the collapse of the old left in the late 1980s, and the rise of market globalization, antiracism found a new purpose ... as a source of legitimacy for the globalist project.

I saw this up close in an antiracist organization during the mid to late 1980s. Truth be told, we mostly did things like marching in the May Day parade, agitating for a higher minimum wage, denouncing the U.S. intervention in Panama, organizing talks about Salvador Allende and what went wrong in Chile ... you get the drift. Antiracism was subservient to the political left. This was not a natural state of affairs, since the antiracist movement—like the Left in general—is a coalition of ethnic/religious factions that prefer to pursue their own narrow interests. This weakness was known to the political right, some of whose members tried to exploit it by supporting Muslim fundamentalists in Afghanistan and elsewhere and black nationalists in Africa, Haiti, and the U.S. Yes, politics makes strange bedfellows.

With the onset of the 1990s, no one seemed to believe in socialism anymore and we wanted to tap into corporate sources of funding. So we reoriented. Leftist rhetoric was out and slick marketing in. Our educational materials looked glossier but now featured crude "Archie Bunker" caricatures of working people, and the language seemed increasingly anti-white. I remember feeling upset, even angry. So I left.

Looking back, I realize things had to happen that way. With the disintegration of the old socialist left, antiracists were freer to follow their natural inclinations, first by replacing class politics with identity politics, and second by making common cause with the political right, especially for the project of creating a globalized economy. Antiracism became a means to a new end.

This is the context that now frames the war on racism. For people in a position to influence public policy, antiracism is not only a moral imperative but also an economic one. It makes the difference between a sluggish return on investment of only 2 to 3% (which is typical in a mature economy) and a much higher one.

What to do?

Normally, I would advise caution. People need time to change their minds, especially on a topic as emotional as this one. When tempers flare, it's usually better to let the matter drop and return later. That's not cowardice; it's just a recognition of human limitations. Also, the other side may prove to be right. So, in a normal world, debate should run its course, and the policy implications should be discussed only when almost everyone has been convinced one way or the other.

Unfortunately, our world is far from normal. A lot of money is being spent to push a phony political consensus against any controls on immigration. This isn't being done in the dark by a few conspirators. It's being done in the full light of day by all kinds of people: agribusiness, Tyson Foods, Mark Zuckerberg, the U.S. Chamber of Commerce, and small-time operations ranging from landscapers to fast-food joints. They all want cheaper labor because they're competing against others who likewise want cheaper labor. It's that simple ... and stupid.

This phony consensus is also being pushed at a time when the demographic cauldron of the Third World is boiling over. This is particularly so in sub-Saharan Africa, where the decline in fertility has stalled and actually reversed in some countries. The resulting population overflow is now following the path of least resistance—northward, especially with the chaos due to the NATO-led invasion of Libya. In the current context, immigration controls should be strengthened, and yet there is lobbying to make them even weaker. The idiocy is beyond belief.

For these reasons, we cannot wait until even the most hardboiled skeptics are convinced. We must act now to bring anti-globalist parties to power: the UKIP in Britain, the Front national in France, the Partij voor de Vrijheid in the Netherlands, the Alternative für Deutschland in Germany, and the Sverigedemokraterna in Sweden. How, you may ask? It's not too complicated. Just go into the voting booth and vote. You don't even have to talk about your dirty deed afterwards. 

It looks like such parties will emerge in Canada and the United States only when people have seen what can be done in Europe. Until then, the tail must wag the dog. We in North America can nonetheless prepare the way by learning to speak up and stand up, and by recognizing that the "Right" is just as problematic as the "Left."
 

References


Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World, Princeton University Press, Princeton and Oxford. 

Clark, G. (2009a). The indicted and the wealthy: surnames, reproductive success, genetic selection and social class in pre-industrial England,
http://www.econ.ucdavis.edu/faculty/gclark/Farewell%20to%20Alms/Clark%20-Surnames.pdf 

Clark, G. (2009b). The domestication of Man: The social implications of Darwin, ArtefaCTos, 2(1), 64-80.
http://campus.usal.es/~revistas_trabajo/index.php/artefactos/article/viewFile/5427/5465 

Clark, G. (2010). Regression to mediocrity? Surnames and social mobility in England, 1200-2009
http://www.econ.ucdavis.edu/faculty/gclark/papers/Ruling%20Class%20-%20EJS%20version.pdf

Cochran, G. and H. Harpending. (2010). The 10,000 Year Explosion: How Civilization Accelerated Human Evolution, New York: Basic Books. 

Frost, P. (2011a). Human nature or human natures? Futures, 43, 740-748.
http://www.researchgate.net/publication/251725125_Human_nature_or_human_natures/file/504635223eaf8196f0.pdf  

Frost, P. (2011b). Rethinking intelligence and human geographic variation, Evo and Proud, February 11
http://evoandproud.blogspot.ca/2011/02/rethinking-intelligence-and-human.html 

Harpending, H., and G. Cochran. (2002). In our genes, Proceedings of the National Academy of Sciences U.S.A., 99, 10-12.
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC117504/  

Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution, Proceedings of the National Academy of Sciences U.S.A., 104, 20753-20758.
http://www.researchgate.net/publication/5761823_Recent_acceleration_of_human_adaptive_evolution/file/9c9605240c4bb57b55.pdf

Mathews, B. (1925). The Clash of Colour. A Study in the Problem of Race, London: Edinburgh House Press. 

Miller, E. (1994). Paternal provisioning versus mate seeking in human populations, Personality and Individual Differences, 17, 227-255.
http://www.prometheism.net/paternal/  

Rushton, J. P. (2000). Race, Evolution, and Behavior, 3rd ed., Charles Darwin Research Institute.
http://lazypawn.com/wordpress/wp-content/uploads/Race_Evolution_Behavior.pdf

The puzzle of European hair, eye, and skin color


 
Mary Magdalene, Frederick Sandys (1829-1904). Is the physical appearance of Europeans solely or even mainly an adaptation to climate?
 

The Russian online magazine Kultura VRN has published an article I wrote on the "puzzle of European hair, eye, and skin color." The following is the original English text.

 

Most humans have black hair, brown eyes, and brown skin. Europeans are different: their hair is also brown, flaxen, golden, or red, their eyes also blue, gray, hazel, or green, and their skin pale, almost like an albino's. This is particularly the case in northern and eastern Europeans. 

How did this color scheme come about? Perhaps the same genes that lighten skin pigmentation also affect hair and eye pigmentation. Yet the genes are different in each case. Our skin became white mainly through the replacement of one allele by another at three separate genes. Our hair acquired a diverse palette of colors through a proliferation of new alleles at another gene. Our eyes acquired a similar palette through similar changes at yet another gene.

This color scheme is puzzling in another way: it is stronger in women than in men. Women are naturally more variable than men in hair color, redheads in particular being more common. They are likewise more variable in eye color in those populations where blue eyes are common. Finally, throughout the world, women are fairer-skinned than men, as a result of cutaneous changes at puberty.

While women are more diversely colored in their hair and eyes, this greater diversity has a different cause in each case. In the case of hair color, women have more of the intermediate hues because the darkest hue (black) is less easily expressed. In the case of eye color, women have more of the intermediate hues because the lightest hue (blue) is less easily expressed.

If hair color and eye color diversified in ways that differ physiologically but are similar visually, then the common purpose of this diversity must be visual. Furthermore, in both cases, this diversity concerns visible features on or near the face—the focus of visual attention.

Sexual selection?

Why would a facial feature become more colorful in one sex than in the other? The likeliest reason is sexual selection, which occurs when one sex has to compete for the attention of the other. This kind of selection favors eye-catching colors that are either bright or novel.

Bright colors stay in memory longer. If we look at the hair and eye colors that arose in Europe, we see that they are brighter than the human norm of black hair and brown eyes. Hair is carrot red but not beet red. Eyes are sky blue but not navy blue.

Novel colors hold attention longer. Attraction to novelty may explain how the European palette of hair and eye colors came into being. First, a new color would appear by mutation and be initially rare and novel. Second, its novelty would attract attention and increase one's chances of mating, with the result that the color would become more common in succeeding generations. Third, attention would now shift toward rarer and more novel colors that had recently appeared by mutation. All in all, it was this fascination with novelty that caused the number of hair and eye colors to increase steadily over time, once sexual selection had become strong enough.
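The three-step process just described amounts to negative frequency-dependent sexual selection: a color is attractive because it is rare, and it loses that advantage as it spreads. The toy Python simulation below is only a sketch under crude assumptions (haploid "colors", a rarity-proportional mating weight, an arbitrary mutation rate); it is meant to show how a preference for novelty keeps adding and retaining new variants instead of letting a single color go to fixation.

import random
from collections import Counter

random.seed(42)

# Toy model of novelty-driven mate choice. All parameters are illustrative.
POP_SIZE = 1000
GENERATIONS = 200
MUTATION_RATE = 0.001   # chance per birth of a brand-new color variant

population = ["black"] * POP_SIZE   # start with one ancestral color
next_variant = 0

for gen in range(GENERATIONS):
    counts = Counter(population)
    # Mating success is weighted by rarity: the rarer the color, the higher
    # the weight of each individual carrying it.
    weights = [1.0 / counts[color] for color in population]
    offspring = random.choices(population, weights=weights, k=POP_SIZE)
    # Occasionally a mutation introduces a novel color.
    for i in range(POP_SIZE):
        if random.random() < MUTATION_RATE:
            next_variant += 1
            offspring[i] = f"variant_{next_variant}"
    population = offspring

print("Distinct color variants maintained:", len(set(population)))
print(Counter(population).most_common(5))

Under this particular weighting, every variant, however rare, receives the same total mating weight, so new colors almost never disappear once they arise; remove the rarity weighting and the simulated population drifts back toward one or a few colors.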

This novelty effect appears in a study on male preferences for female hair color. Men were shown a series of photos of attractive blondes and brunettes, and they were asked to choose the one they most wanted to marry. It turned out that the scarcer the brunettes were in the series, the likelier any one brunette would be chosen. Another study likewise found that Maxim cover girls are much more often light blonde or dark brown than the usual dark blonde or light brown of real life.

A preference will become a choice only if one has a choice. This is the principle of sexual selection: one sex is in a better position to choose than the other. In most mammalian species, females are in a better position because they can choose among a larger number of males on the mate market. This is because the latter are almost always available for mating, whereas females are unavailable during pregnancy and the period of infant care. Males thus tend to be polygamous.

In early human societies that lived from hunting and gathering, the incidence of polygamy varied with latitude. It was highest in the tropics, where a woman could gather food year-round and feed herself and her children with little male assistance. This self-reliance made it easier for her mate to look for another woman.

Beyond the tropics, women were less self-reliant, particularly during winter when they could no longer gather food and depended on meat from their spouses. This dependence increased with longer winters at higher latitudes. In the Arctic, only a very able hunter could support a second wife. 

Higher latitudes meant not just fewer men on the mate market but also fewer men altogether. Because women could not supply as much food and because the land supported less wildlife, men had to hunt for a longer time over longer distances, with the result that more of them died from falls, drowning, starvation, and cold. Women thus faced a competitive mate market and strong sexual selection. This was especially so on the continental steppe-tundra of the sub-Arctic, where almost all food came from long-distance hunting. 

During the last ice age, this steppe-tundra covered more territory, stretching from the plains of Europe to Alaska. But it was continuously inhabited only at its western end. The climate was milder there because the Scandinavian icecap had pushed the steppe-tundra to the south and because the Atlantic Ocean provided warmth and moisture. These conditions favored a lush growth of grasses, mosses, lichens, and low shrubs, which supported large herds of herbivores and, in turn, a large human population. The climate was less favorable east of the Urals, in Asia, where the steppe-tundra was colder and drier because it was located farther north and farther from the Atlantic's moderating influence. As a result, the human population was smaller and more vulnerable to extinction, particularly during the glacial maximum.

In sum, the European steppe-tundra was a singularity among the many environments that confronted early humans as they spread around the world. Food was abundant but accessible only to males of hunting age, whose ranks were thinned by hunting deaths. A surplus of single women developed, partly because men were fewer in number and partly because men could not easily bear the cost of providing for a second wife and her children. Women thus had to compete against each other for a smaller number of potential mates, the result being strong sexual selection for those women with eye-catching characteristics.

Ancient DNA

Today, this is the same region where the skin is whitest and the hair and eyes most diversely colored. Here, too, the earliest evidence of this color scheme has been found in ancient DNA from human remains. Initially, it was thought that blue eyes arose among the hunter-gatherers of the Mesolithic and white skin among the farmers of the Neolithic. This view has been challenged by genetic evidence of white skin, red hair, blonde hair, and blue eyes in the remains of Mesolithic hunter-gatherers from Scandinavia and Russia. It seems that some people already had the European color scheme at that early date, but only in the north and east of Europe.

But when exactly did this color scheme develop? Probably earlier still—sometime between the earliest Mesolithic evidence (8,000 years ago) and Kostenki Man (circa 37,000 years ago), who still had dark skin, dark eyes, and an African facial shape. As we retrieve more ancient DNA, we may narrow this timeframe, perhaps to the last ice age (circa 10,000 to 25,000 years ago) when steppe-tundra covered the plains of northern and eastern Europe ... and where men were a critical resource in limited supply.

That is a big change over a short time. If sexual selection had not been the cause, what else could have been? The need to adapt to weaker sunlight and a colder climate? Why, then, did this evolution not happen among indigenous peoples who live just as far north in Asia and North America? In any case, why would a northern climate favor a proliferation of new hair and eye colors?

Future research

We cannot go back in time to see why early Europeans changed so fast and so radically. But we can question "witnesses" from that time. As we have seen, one witness is ancient DNA, and this research is ongoing.

Another witness is sex linkage. If sexual selection had acted on early European women, it should have directly modified their physical appearance. Since most genes have little or no sex linkage, this selection would have also indirectly modified the appearance of early European men. But there should still be some signs of sex linkage. We know, for example, that blue eyes are associated with a more feminine face shape. Other examples probably remain to be found.

Finally, there is the witness of culture. Single women, typically virgins, hold an unusual importance in the myths, folklore, and traditions of Europe. In this, we may see an echo of a time when many women never married and became oriented toward communal tasks, such as tending camp fires or acting as seers, sibyls, oracles, and the like. That period of prehistory may have influenced the subsequent course of cultural evolution, thereby giving women a greater role in society at large than they otherwise would have.

Reference

Frost, P. (2015). Загадка цвета кожи, волос и глаз у европейцев [The puzzle of European hair, eye, and skin color], Культура ВРН, July 7, translated by Dr. Yuri Lozotsev.
http://culturavrn.ru/world/15780

Not everyone does it


 
Un homme et une femme, 1891, Stephan Sinding (1846-1922). Almost as fun as sex.

 

All humans love to kiss, so kissing must go back to early hominids and even chimps and bonobos. This is how ethologists and evolutionary psychologists think when they write about the subject.

Just one thing. Even in historic times not all humans loved to kiss. Far from arising millions of years in the past, kissing seems to have arisen no earlier than 40,000 years ago, when modern humans began to enter northern Eurasia.

So concludes a recent cross-cultural study:

We found only 77 out of 168 (46%) cultures in which the romantic-sexual kiss was present. Significantly, no ethnographer working with Sub-Saharan African, New Guinea, or Amazonian foragers or horticulturalists reported having witnessed any occasion in which their study populations engaged in a romantic-sexual kiss. However, kissing appears to be nearly ubiquitous among 9 of the 11 foragers living in Circum-Arctic region (i.e., northern Asia and North America). The concentration of kissing among Circum-Arctic foragers, for which we do not have a satisfactory explanation other than invoking cultural diffusion, stands in stark contrast to its equally striking absence among foragers in other cultural regions. (Jankowiak et al., 2015)

This is not the first study to deny the universality of kissing, although scholars have tended to place its origin in the civilizations of the Mediterranean, the Middle East, and South Asia (Hawley, 2007; Hopkins, 1907). The English sexologist Havelock Ellis argued that kissing began with "civilized man": 

It is only under a comparatively high stage of civilization that the kiss has been emphasized and developed in the art of love. Thus the Arabic author of the Perfumed Garden, a work revealing the existence of a high degree of social refinement, insists on the great importance of the kiss, especially if applied to the inner part of the mouth, and he quotes a proverb that "A moist kiss is better than a hasty coitus." Such kisses, as well as on the face generally, and all over the body, are frequently referred to by Hindu, Latin, and more modern erotic writers as among the most efficacious methods of arousing love. (Ellis, 1897-1928)

It may be that kissing originated in prehistory among the hunter-gatherers of northern Eurasia and then spread south, where it reached its full flowering in a milieu that idealized it in prose, poetry, and painting. A kind of positive feedback thus developed between the practice and the ideal.

Then, at a later date, it became less common in northern Europe because of the moral climate that followed the Reformation. Previously it had been very common; when the Greek scholar Demetrios Chalkokondyles (1423-1511) visited England, he was surprised by its ubiquity:

As for English females and children, their customs are liberal in the extreme. For instance, when a visitor calls at a friend's house, his first act is to kiss his friend's wife; he is then a duly-installed guest. Persons meeting in the street follow the same custom, and no one sees anything improper in the action. (Bombaugh, 1876, p. 33)

Another Greek traveler likewise remarked a century later:

The English manifest much simplicity and lack of jealousy in their customs as regards females; for not only do members of the same family and household kiss them on the lips with complimentary salutations and enfolding of the arms around the waist, but even strangers, when introduced, follow the same mode, and it is one which does not appear to them in any degree unbecoming. (Bombaugh, 1876, p. 33)

Similar comments were made by Erasmus (1467-1536):

If you go to any place, you are received with a kiss by all; if you depart on a journey, you are dismissed with a kiss; if you return, the kisses are exchanged. Do they come to visit you, a kiss is the first thing; do they leave you, you kiss them all around. Do they meet you anywhere, kisses in abundance. In short, wherever you turn, there is nothing but kisses. Ah, Faustus, if you had once tasted the tenderness, the fragrance of these kisses, you would wish to stay in England, not for ten years only, but for life. (Bombaugh, 1876, p. 34)

Kissing then fell into decline among the English, so much so that frequent public displays came to be seen as a continental thing. Nonetheless, it remained much more common in Europe than in other parts of the world, particularly East Asia and sub-Saharan Africa. This difference amused travelers as late as the 19th century:

An American naval officer, who had spent considerable time in China, narrates an amusing experience of the ignorance of the Chinese maidens of the custom of kissing. Wishing to complete a conquest he had made of a young mei jin (beautiful lady), he invited her—using the English words—to give him a kiss. Finding her incomprehension of his request somewhat obscure, he suited the action to the word and took a delicious kiss. The girl ran away into another room, thoroughly alarmed, exclaiming, "Terrible man-eater, I shall be devoured." (Bombaugh, 1876, p. 80)

For the same time period, Havelock Ellis noted: "Kisses, and embraces are simply unknown in Japan as tokens of affection" with the exception of mothers hugging and kissing their infants. Similarly, "among nearly all of the black races of Africa lovers never kiss nor do mothers usually kiss their babies." He then went on to argue that the romantic kiss evolved out of this maternal kissing, which seems more or less universal.

With the globalization of culture through movies, magazines, and other media, kissing has spread to the four corners of the earth. Clearly, we can all do it and enjoy doing it to some extent. But I don't think we all share the same urge to do it.

Gene-culture coevolution?

Don't laugh. Even religiosity is partly genetic, so why not the desire to kiss and be kissed? What little we know about the subject comes from studies of compulsive kissing syndrome, where a lesion to the right temporal lobe (associated with epilepsy or glioma) causes an uncontrollable urge to kiss anyone, independently of sexual interest (Mendez and Shapira, 2014; Mikati et al., 2005; Ozkara et al., 2004). This compulsion differs from other disorders where increased kissing results from loss of sexual inhibition and is targeted at sexually desirable individuals. The brain may thus have a pre-formed circuit that triggers the desire to kiss. In short, kissing is not solely learned. It has an innate component.

At first, this innate component would have been the same in all humans, back when kissing mainly happened between a mother and her infants. It then became more sexual and more important among the hunter-gatherers of northern Eurasia. Later still, in Europe and the Middle East, it developed into a second channel of sexual arousal almost on a par with the sex act itself.

As Havelock Ellis observed:

[...] there is certainly no such channel for directing nervous force into the sexual sphere as the kiss. This is nowhere so well recognized as in France, where a young girl's lips are religiously kept for her lover, to such an extent, indeed, that young girls sometimes come to believe that the whole physical side of love is comprehended in a kiss on the mouth; so highly intelligent a woman as Madam Adam has described the agony she felt as a girl when kissed on the lips by a man, owing to the conviction that she had thereby lost her virtue.

Sexual kissing initially arose through people pushing the envelope of phenotypic plasticity. This envelope in turn became part of the environment that people had to fit into. Those who couldn't, or wouldn't, were at a disadvantage and were bit by bit pushed out of the gene pool. Those who could, and would, took their place. New genetic variants thus arose and flourished, some to strengthen the new behavior and others to make it more pleasurable.

In this, and in many other ways, Man has created Man. We humans have shaped our environment, which in turn has shaped us, even in our genes. This point becomes clear only if one abandons the assumption, so dear to evolutionary psychology, that we stopped evolving back in the Pleistocene. We didn't. In fact, most of the interesting stuff has come about since then.

References 

Bombaugh, C.C. (1876). The Literature of Kissing, gleaned from history, poetry, fiction, and anecdote, Philadelphia: J.B. Lippincott & Co.
https://books.google.ca/books?id=p9lPAAAAYAAJ&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false  

Ellis, H. (1897-1928). Studies in the Psychology of Sex, vol. IV, Appendix A. The origins of the kiss.
https://www.gutenberg.org/files/13613/13613-h/13613-h.htm 

Hawley, R. (2007). 'Give me a thousand kisses': the kiss, identity, and power in Greek and Roman antiquity, Leeds International Classical Studies, 6.5
http://pdf.thepdfportal.net/?id=123399 

Hopkins, E.W. (1907). The sniff-kiss in ancient India, Journal of the American Oriental Society, 28, 120-134.
http://www.jstor.org/stable/592764?seq=1#page_scan_tab_contents 

Jankowiak, W.R., S.L. Volsche, J.R. Garcia. (2015). Is the Romantic-Sexual Kiss a Near Human Universal? American Anthropologist, early view
http://onlinelibrary.wiley.com/wol1/doi/10.1111/aman.12286/abstract 

Mendez, M.F. and J.S. Shapira. (2014). Kissing or "Osculation" in Frontotemporal Dementia, The Journal of Neuropsychiatry & Clinical Neurosciences, 26, 258-261.
http://neuro.psychiatryonline.org/doi/full/10.1176/appi.neuropsych.13060139 

Mikati, M.A., Y.G. Comair, A.N. Shamseddine. (2005). Pattern-induced partial seizures with repetitive affectionate kissing: an unusual manifestation of right temporal lobe epilepsy. Epilepsy & Behavior, 6, 447-451
http://www.sciencedirect.com/science/article/pii/S1525505005000144 

Ozkara, C., H. Sarı, L. Hanoglu, et al. (2004). Ictal kissing and religious speech in a patient with right temporal lobe epilepsy, Epileptic Disorders, 6, 241-245
http://www.jle.com/fr/revues/epd/e-docs/ictal_kissing_and_religious_speech_in_a_patient_with_right_temporal_lobe_epilepsy__265692/article.phtml?tab=download&pj_key=doc_alt_2458