Wednesday, February 20, 2008

February 17, 2008

Boys Will Be Boys, Girls Will Be Hounded by the Media

The New York Times

A VIDEO of Heath Ledger hanging out at a drug-fueled party two years before his death would seem to constitute must-see material for a tabloid entertainment show.

But when such a video ended up in the hands of the producers of “Entertainment Tonight,” the program declined to broadcast it, a spokeswoman said, “out of respect for Heath Ledger’s family.” The 28-year-old actor died on Jan. 22 from what the medical examiner called an accidental overdose of prescription medications.

Amy Winehouse did not merit the same discretion. Images from a video that showed her smoking what a British tabloid, The Sun, said was a pipe of crack cocaine, as well as admitting to having taken “about six” Valium, were widely disseminated in the news media around the same time.

When Owen Wilson was hospitalized in August after an apparent suicide attempt, his plight was the subject of a single US Weekly cover story. Not so Britney Spears, recently confined in a psychiatric ward, who has inspired six cover stories for the magazine during the same time span.

When Kiefer Sutherland was released from the jail in Glendale, Calif., after serving a 48-day sentence for a drunken driving conviction, the event merited little more than buried blurbs.

Contrast this to Paris Hilton’s return to jail last year after a brief release to serve the rest of a 45-day sentence for a probation violation involving alcohol-related reckless driving. The event invited a level of attention that evoked the O. J. Simpson trial. Hordes of cameras enveloped the limousine that ferried the tear-streaked heiress to jail.

Yes, women are hardly the only targets of harsh news media scrutiny — just ask Mel Gibson. But months of parallel incidents like these seem to demonstrate disparate standards of coverage. Men who fall from grace are treated with gravity and distance, while women in similar circumstances are objects of derision, titillation and black comedy.

Some celebrities and their handlers are now saying straight out that the news media have a double standard.

“Without a doubt, women get rougher treatment, less sensitive treatment, more outrageous treatment,” said Ken Sunshine, a publicist whose clients include Ben Affleck and Barbra Streisand. “I represent some pretty good-looking guys, and I complain constantly about the way they’re treated and covered. But it’s absolutely harder for the women I represent.”

Liz Rosenberg, a publicist at Warner Bros./Reprise Records who represents Madonna, among others, also thinks sexism is at work. “Do you see them following Owen Wilson morning, noon and night?” she asked.

Some editors confirm that they handle female celebrities differently. But the reason, they say, is rooted not in sexism, but in the demographics of their audience.

The readership of US Weekly, for example, is 70 percent female; for People, it’s more than 90 percent, according to the editors of these magazines.

“Almost no female magazines will put a solo male on the cover,” said Janice Min, the editor in chief of US Weekly. “You just don’t. It’s cover death. Women don’t want to read about men unless it’s through another woman: a marriage, a baby, a breakup.”

Thus, magazine coverage of Mr. Ledger’s death gave way to stories about Michelle Williams, Mr. Ledger’s former girlfriend and the mother of his daughter; US Weekly, for instance, put the headlines “A Mother’s Pain” and “My Heart is Broken” atop a four-page spread. Mary-Kate Olsen, telephoned several times by the discoverer of Mr. Ledger’s body, came in for it, too: “What Mary-Kate Knows” trumpeted In Touch Weekly.

Indeed, while one of People’s best-selling issues of the last year was its cover story on Mr. Wilson’s suicide attempt, a follow-up cover on his recovery was one of the worst sellers, said Larry Hackett, the managing editor.

Conversely, he said, the Britney Spears story continues to flourish precisely because women are fascinated by the challenges facing a young mother.

“If Britney weren’t a mother, this story wouldn’t be getting a fraction of the attention it’s getting,” Mr. Hackett said. “The fact that the custody of her children is at stake is the fuel of this narrative. If she were a single woman, bombing around in her car with paparazzi following, it wouldn’t be the same.”

Others, like Roger Friedman, an entertainment reporter for FoxNews.com, said that female stars tend to make more-compelling stories because “they are more emotional and open” about their problems. Male stars, he said, tend to be “circumspect.”

Rebecca Roy, a psychotherapist in Beverly Hills, Calif., who has several clients in the entertainment industry, said that male celebrities can often wriggle out of trouble with a rakish bad-boy shrug. But, she said, the double standard can reinforce the destructive behavior of female stars, pushing them to further depths of substance abuse and erratic behavior.

Ms. Roy said that troubled male stars like Robert Downey Jr. are encouraged to move past problems to a second act in their careers, while the personal battles of women like Lindsay Lohan or the late Anna Nicole Smith are often played for maximum entertainment value.

“With men, there’s an emphasis on, ‘he had this issue, but he’s getting over it,’ ” Ms. Roy said. “But with women, it’s like they keep at it, keep at it. It’s almost like taking the wings off of a fly.”

Ms. Min acknowledged that her magazine played down its coverage of Owen Wilson and Heath Ledger. Part of the reason, she said, was that female readers tend to be sympathetic toward young men in crisis.

“With Heath Ledger, people walked on eggshells trying to strike the right tone,” Ms. Min said, adding that “public sentiment for Heath Ledger factored into our coverage.”

Edna Herrmann, a clinical psychologist in Los Angeles, said that while schadenfreude is part of the enjoyment of star travails, women especially respond to female celebrities with commonplace demons. “Misery likes company,” Dr. Herrmann said.

But some believe the power of a celebrity’s publicist has more bearing on coverage than gender. “Entertainment Tonight” reversed its plans to show the video of Mr. Ledger following protests from stars like Natalie Portman and Josh Brolin organized by ID, which represented Mr. Ledger and still represents Ms. Williams.

In some cases, celebrities may be victims of their own appetites for media attention.

“It would seem to me that no one who demanded, who expected privacy, at the get-go was denied that privacy,” said Stan Rosenfield, a publicist who represents George Clooney.

And Harvey Levin, the managing editor of the gossip Web site TMZ.com, said that female stars are afforded every opportunity to move past their sins, as long as they clean up their behavior.

“Nicole Richie, who took a beating generally for being a screw-up, has turned it around, and everyone’s cheering for her now,” Mr. Levin said of the former Paris Hilton sidekick and tabloid staple, now the mother of a month-old daughter.

Even if news media coverage is weighted in their favor, male celebrities aren’t exactly feeling immune from harsh scrutiny.

“There is certainly an argument for it being incredibly sexist, the attention that’s given to women and the hounding of them,” the actor Colin Farrell said at a recent party for his new film, “In Bruges.”

Mr. Farrell, who has attracted his share of attention, said such potential bias did not make him any less of a news media target. “If they catch me out and about,” he said, “they’ll go for it.”

As Mr. Farrell spoke in a room filled with journalists and photographers, he was not even sipping a beer.

Additional reporting by Paula Schwartz.

Tuesday, February 12, 2008

Press and Power

By César Hildebrandt

In La Primera, February 12, 2008


How far should the power of the press extend? That depends on the press. If we are talking about the press that truly investigates, that separates the interests of its owners from the need to serve public opinion, the answer should be: as far as the search for truth allows.

Yet that ideal press – the Post of the Grahams, The New York Times before the sickness of “patriotism” – is today disappearing, or weakening before our eyes.

With Jesús de Polanco dead, for example, El País, the largest and best newspaper written in Spanish, is torn between pressure from Polanco’s successors to defend the Prisa empire and the demands for independence from its now aging readers. And, of course, the almost candid confession of Juan Luis Cebrián, founder of El País and longtime Prisa board member, matters here: “The press cannot stop being a profitable business.”

That is true. The trouble is that profitable businesses in other spheres do not have as their product the search for truth, which tends to be so uncomfortable and so explosive for the most profitable businesses on the planet (financial speculation, oil, arms manufacturing, drug trafficking, money laundering).

How, then, to pursue a truth that may wound the most powerful and still count on the advertising and banking favor of the powerful?

That is the key to the whole problem. And faced with that dilemma, the global response of the big press has been to dismantle its investigative teams as far as possible and, at the same time, to aim what investigation remains at scrutinizing the weaknesses of politicians – which is all very well – but at the cost of never touching the monstrous power of the corporations (the true government of the global village).

Does the future lie, then, in so-called “citizen journalism,” the kind that springs from blogs on the Internet? I doubt it: with each passing day the blogs display, with the usual exceptions, the same defects as the big press and the same vices of journalism: the cult of established power, the mental routine that turns one into a fossilized piece “of the politically correct,” the implicit belief that the alliance between the free market and electoral democracy is the end of history (that is, the end of all rebellion). And with an aggravating factor: many “investigative” blogs are nothing but the furious expression of personal likes and hatreds, with nothing of professional investigation about them. They confuse, with utter amateurism besides, apples with oranges, and on top of that they editorialize with the certainty of someone who believes that doubting is a mortal sin.

Where is the light at the end of the tunnel? Perhaps in newspapers made by journalists, phalansteries of communication that sell massively, that can loftily do without advertising, and that are rewarded with success thanks to their proven independence and rigor.

Rigor. That is the word that frightens many. Because it is one thing to take the statement of a dubious witness and turn it into a damning pamphlet against the victim of the moment – that is what is done to those who have no chance to defend themselves on equal terms – and another to investigate with intelligence and resources in search of a truth generally hidden among weeds and seemingly indecipherable papers, among shell companies and front men based on Caribbean islands. Investigation requires a great deal of talent and a bit of money. In our milieu both are scarce: the universities turn out naive question-askers – not investigators – and companies are interested in investigation, as a rule, only if it costs nothing, and all the more if it is aimed at some adversary of the newspaper.

And rigor is what the Peruvian press has lost. With the exception of Páez, Cruz and Uceda – patient moles, successful most of the time – the professionals of investigation are conspicuous by their absence. They have often been replaced by fast-food defamers, assemblers of plots that do not hold up over time but can make an impact in the moment.

If someone set out to investigate the true power of drug trafficking in Peru, would they not reach surprising conclusions? Why was the Banco de Crédito’s purchase of black-market dollars from Uchiza never investigated? Why are we not told how many fishing companies have been reached by the power that buys everything? Why will we never know how many journalists, so concerned about the power of cocaine, consume it in abundance and feel omnipotent and untouchable from inhaling it?

Have you noticed that there are people very interested in having us settle for the set menu of the drug trafficker and murderer Fernando Zevallos, already sentenced to 20 years in prison and a forced resident of Piedras Gordas?

How many exporters have shipped coca out through the port of Callao since the day someone did away with the oversight of the private customs inspection firms?

Who wants to make us believe that drug trafficking is a matter that concerns only narcos already behind bars and backpack couriers from the VRAE, when it is also “a matter of whites” – as they say in the Peru that Jorge Bruce has just described so well?

Postscript: The Public Ministry should grant the brittle prosecutor Luz Loayza the favor of staying in Lima. After all, it is not as if Maynas would rise up in her absence. And as for you, Aldito: why do you try so obsessively to smear those who remind you, with a bit of humor, of your trip paid for by Suez Energy? And you always fall flat on your face, dear Aldo: if the DEA were on my trail, as you imagine in your opium dreams, Dr. Loayza, who works there, would already have blurted it out at the top of her lungs. As for my English: it is very bad, but it is better than your Finnish. Remember?

Monday, April 23, 2007

Evolution of the theses

The Age (Australia) April 21, 2007

Jonathan Gottschall takes a novel approach to Darwinism.

MARXIST, RADICAL feminist, Foucauldian, deconstructionist, post-colonial and queer. It reads like the fight card for an ideological battle royal. In fact, these are some of the major schools of thought in literary criticism from the past 40 years - and they have much in common.

Central to these and all other approaches to understanding literature that are influenced by post-structuralism is the idea that there is no innate human nature. Nature is nurture or, put another way, our nature is to spoon up whatever culture happens to feed us - and we are what we eat.

Understanding a story is ultimately about understanding the human mind. The primary job of the literary critic is to pry open the craniums of characters, authors and narrators, climb inside their heads and spelunk through the bewildering complexity within to figure out what makes them tick.

Yet, in doing this, literary scholars have ignored the recent scientific revolution that has transformed our understanding of why people behave the way they do. While evolutionary biologists have irreparably shattered the blank slate, most students of the humanities still insist that humans are born all but free of any innate qualities.

My fellow literary Darwinists and I hope to change their minds. By applying evolution-based thinking to fiction, we believe we can invigorate the study of literature, while at the same time mining an untapped source of information for the scientific study of human nature (see "Truth in fiction"). Darwinian thinking can help us better understand why characters act and think as they do, why plots and themes resonate within such very narrow bounds of variation, and the ultimate reasons for the human animal's strange, ardent love affair with stories.

It may sound like an innocent endeavour, but this is potentially revolutionary. If literary Darwinism is mainly right, then much of what has been written and said in the realm of literary theory and criticism in the second half of the 20th century is in need of significant revision.

Literary Darwinism has emerged during a period of crisis in literary studies. Enrolments and funding are in decline, books languish unpublished as readerships dwindle, and prospects for new PhDs are abysmal. Perhaps worst of all, literary scholars are at risk of being presented as a laughing stock by novelists and held up to ridicule by satirical journalists. There is a dreadful sense that the whole reputation of the study of the humanities is in free fall.

This drop feels all the more vertiginous given the soaring stock of the sciences. While many literary scholars have responded by trying to knock science down a peg, literary Darwinists have taken the opposite tack. We have posed two questions: what exactly is science doing right that we are doing so wrong, and can we emulate it?

I BEGAN ASKING THESE questions in the mid-1990s while I was working towards a PhD in English literature. At the time, I was sceptical of much of what I was being told in my literary theory courses, but my reasons were vague and disordered. These misgivings coalesced when I chanced across a tattered copy of the zoologist Desmond Morris' book The Naked Ape in a used-book store.

While the specifics of the 1960s bestseller were outdated, its general attitude towards human behaviour was not. Morris argued that although humans have complicated culture and a stunning capacity to learn, this does not change the fact that we are also animals, vertebrates, mammals, primates and, ultimately, great apes.

Aspects of our culture and intelligence mean we are different from other apes but do not emancipate us from biology or lift us above other animals onto an exalted link of the chain of being. What's more, it follows that the behavioural characteristics of the human animal, just like the physical ones, should be understood as the products of a long evolutionary process. Morris did not claim this rendered all other perspectives on human behaviour obsolete, just that an important fact had been neglected to the detriment of our understanding: people are apes.

At exactly the same time I was reading The Naked Ape I was re-reading Homer's Iliad for a graduate seminar on the great epics. As always, Homer made my bones flex and ache with the terror and beauty of the human condition. But this time around I also experienced the Iliad as a drama of naked apes - strutting, preening, fighting and bellowing their power in fierce competition for social dominance, beautiful women and material resources. Darwin's powerful lens brought sudden coherence to my experience of the story, inspiring me to abandon my half-drafted PhD dissertation and instead undertake a Darwinian analysis of the Iliad.

The study began with a simple observation. Intense competition between great apes, as described both by Homer and by primatologists, frequently boils down to precisely the same thing: access to females. In Homer, conflicts over Helen, Penelope and the slave girl Briseis are just the tip of the iceberg. The Trojan war is not only fought over Helen, it is fought over Hector's Andromache and all the nameless women of ordinary Trojan men.

"Don't anyone hurry to return homeward until after he has lain down alongside a wife of some Trojan," the old counsellor Nestor exhorts the Greeks. Capturing women was not just a perk of war, it was one of the important reasons for war. Achilles conveys this in his soul-searching assessment of his life as warrior: "I have spent many sleepless nights and bloody days in battle, fighting men for their women."

THE INTENSE competition for women suggests they were scarce. Some scholars have raised the possibility that Homeric peoples, including the Greeks of the eighth century BC, practised female infanticide. I argue that a potentially more important cultural practice has been overlooked. Although Homeric men did not have multiple wives, most leading men were polygynous: in addition to their wives they hoarded slave women whom they treated as their sexual property.

For every extra woman possessed by a high-status man, some less fortunate or less formidable Greek lacked a wife. Comparative anthropology shows the results of such a situation are all but guaranteed. Wherever there are "missing females" - from modern China and India to ancient Greece - there will be strife over women and fierce competition among men for the wealth and prestige needed to attract them.

My study of Homer is informed by insights from a range of sciences including evolutionary biology, behavioural genetics, evolutionary and developmental psychology and cognitive science - what Harvard University psychologist Steven Pinker calls "the new sciences of human nature". But while the theory driving the study is scientific, the methods are not.

Lately, my colleagues and I have been seeking to apply scientific methods in our investigations of literature. These efforts crash up against the scepticism of our peers - against a widespread feeling that any attempt to formulate a "literary science" is risibly oxymoronic. Our critics argue that literary scholars - Marxists, psychoanalysts, structuralists - have repeatedly tried to make the discipline more scientific and that these miserable experiments in science-envy have always ended in farce. This is true, but literary Darwinism is different.

While these approaches imported concepts, jargon and data from more scientific fields, they never attempted to adopt the scientific method, developing competing hypotheses and empirically testing them. To anyone who wonders how there can be a science of literature that assigns numbers to the riot of information conveyed in a text, we answer that it is not easy, but it can be done.

Take the study recently completed by the leading figure in literary Darwinism, Joseph Carroll from the University of Missouri - St Louis, in collaboration with myself and psychologists John Johnson from Pennsylvania State University in DuBois and Daniel Kruger from the University of Michigan in Ann Arbor.

This web-based survey of more than 500 avid readers was designed to test specific hypotheses at the nexus of literature and evolutionary science. Respondents answered questions about the motives, mate-selection strategies and personalities of 144 principal characters in a broad selection of Victorian novels, and rated their emotional responses to the characters.

What did we find? First, that readers' responses reflect an evolved psychological tendency to envision human social relations as morally polarised struggles between "us" and "them". Protagonists and their allies form co-operative communities that readers empathise with and participate vicariously in. By contrast, readers tend to view antagonists and their allies as an "out-group" - a malign force, motivated by a desire for social dominance as an end in itself, that threatens the very principle of community.

IN ADDITION, THE DATA also allowed us to weigh in on some old and acrimonious literary debates. For instance, scholars have long argued about whether authors tightly control literary meaning, or whether readers create their own highly idiosyncratic interpretations of the novels they read. In recent decades, the most influential figures in literary analysis have promoted the latter view, spawning the mantra of "the death of the author".

Our findings contradict this. While readers do vary in their emotional and analytical responses, the variation is contained within tight boundaries. At least as far as the Victorian novel goes, the author is alive and well, expertly orchestrating reader response.

To take one more example, feminist scholars have long maintained that European fairytales wantonly inflict psychic violence upon the vulnerable minds of children, especially girls, by promoting stereotypical gender roles. They maintain that images of swashbuckling heroes and beautiful young maidens yearning for dashing princes are not in any sense "natural", but instead reflect and perpetuate the arbitrary gender arrangements of patriarchal Western culture.

To test this assertion, I convened a team of content analysts to gather quantitative data on the depiction of folk-tale characters from all around the world. What we found was that the feminist critique is both right and wrong.

European tales do portray men as more active and more physically courageous, while females are much less likely to be the main character and have far more emphasis placed on their beauty. But it also became clear that these stereotypes are not merely constructed to reinforce male hegemony in Western societies.

We encountered precisely the same gender descriptions wherever we moved through the landscape of world folk tales - across continents, cultures and centuries, and in all societies from hunter-gatherer to pre-industrial. While cultural attitudes undeniably influence gender identity, some differences between male and female folk-tale characters are universal, perhaps because they have deep roots in biological differences between the sexes.

Literary Darwinism is still at a stage of adolescent awkwardness. Nevertheless, we believe our approach has the potential to breathe new life into a struggling field. In literary studies, faulty theories of human nature have given rise to faulty theories of literature, which have in turn generated faulty hypotheses.

BECAUSE LITERARY methods are exclusively non-quantitative and often impressionistic, these hypotheses have rarely been systematically tested. As a result, literary scholars have seldom produced knowledge that can withstand the critiques of the next generation. At least literary Darwinism offers hope of breaking out of this cycle. At best we will start to build a literary understanding that can progress in much the same way that science progresses. It is a bold experiment that may not succeed, but what experiment worth doing is risk-free?

- NEW SCIENTIST

Thursday, April 5, 2007

BOOK REVIEW

'Leni: The Life and Work of Leni Riefenstahl' by Steven Bach

She overlooked the evils and emphasized the romance of Nazi power.

By Richard Schickel

Leni: The Life and Work of Leni Riefenstahl
Steven Bach
Alfred A. Knopf: 386 pp., $30

Leni Riefenstahl was a slut. Steven Bach is too graceful a writer and too nuanced a psychologist to summarize this life so bluntly, but, for the reader of his brilliant biography of the Nazi filmmaker, that conclusion is inescapable.

We are not speaking primarily of her sexual life, though it was relentlessly busy (her taste ran to hunky jock types and, equally, to men who could advance her career). That epithet applies also to her blind — and blinding — ambition. There was no one she would not try to seduce, in one way or another, in pursuit of fame, fortune and power — including, of course, smitten, impotent Adolf Hitler, who was über alles among her admirers.

With "Triumph of the Will" (about the Nazi party rally at Nuremberg in 1934) and "Olympiad" (about the 1936 Berlin Olympic Games), Riefenstahl, it's not an exaggeration to say, created almost every significant visual image that we now retain of National Socialism in all its evil pomp. Later, when the Thousand-Year Reich turned out to have a rather shorter life span than its propagandists predicted and she lived rather longer than normal (she died at the age of 101 in 2003), she devoted most of her energy to litigious self-justification of her years as Hitler's willing executioner of imagery. In essence, she fought her 58-year defensive battle in the same way that she had pursued her more meteoric advance to global fame — under the flag of artistic purity. As she would have it, she aspired only to the sublime, and that shining light blinded her to rumors of concentration camps, Gestapo torture chambers and the gas ovens.

Riefenstahl claimed, probably truthfully, that she was never a Nazi party member and evaded the worst punishments of the postwar denazification process, though she never again made a significant film. Over these later years, she attracted the support of gaga cinephiles, who inanely insisted, as one of them put it, that "politics and art must never be confused." It is biographer Bach's business to demolish that nonsense while also creating an almost novelistically compelling narrative of a life endlessly obfuscated by lies.

The daughter of a plumber, Riefenstahl began her public life as an "interpretive" dancer in the Modernist vein and then did a turn (which she later denied) dancing semi-nude in the film "Ways to Strength and Beauty." She achieved eminence first as a star, then as a director, of "mountain films," a popular, peculiarly Germanic genre in which wild, primitive people dare to scale beautiful yet menacing Alpine peaks, achieving death and transfiguration at the end of their exertions. At the time, most people viewed these movies as escapist, though Siegfried Kracauer (a mere critic at the time, not yet the eminent historian of German film he would become) saw in these films something "symptomatic of an antirationalism on which the Nazis could capitalize."

There was perhaps more to it than that. As Susan Sontag wrote in her seminal essay "Fascinating Fascism," the mountain films offered "a visually irresistible metaphor for unlimited aspiration toward the high mystic goal, both beautiful and terrifying, which was later to become concrete in Führer-worship." The would-be Führer saw this. And Riefenstahl, his would-be acolyte, was paying attention too. She read "Mein Kampf" and, typically, pressed that noxious rant upon a Jewish lover, saying, "Harry, you must read this book. This is the coming man."

Adolf and Leni were mutually enthralled from the moment they met — to the point that the world's tabloid press kept ludicrously hinting at a sexual liaison. They had something better; they were soul mates. To her dying day, she insisted that "Triumph of the Will" was cinéma vérité, a morally neutral record of a great historical event. But Albert Speer, Hitler's kept architect, was essentially her art director, the occasion was staged with her camera positions always in mind, and the film was financed entirely with government funds. The same was true of her Olympic film. She always claimed that Joseph Goebbels, Hitler's propaganda minister, was her enemy, but Bach is particularly good at unraveling that whopper. Goebbels resented her direct line to Hitler — she was the only German director not obliged to submit to his dictates — but their squabbles were mainly bureaucratic, and Goebbels' diary entries about her are mostly admiring.

Why would they have been otherwise? "Triumph" and "Olympiad" celebrate the official Nazi message: "Strength Through Joy." The former offers heroic shots of young Aryans larkishly bathing in their encampments before assembling into impressive masses, their individuality welded into anonymous yet strangely glamorous menace. The Olympic movie was more in the spirit of the mountain films: In company with a beamish Hitler, gorgeous and graceful athletes (Leni, incidentally, was having an affair with an American decathlon winner) idealistically strain for metaphorical mountaintops. The "purity" of their efforts sends an anti-intellectual, or blood and iron, message to sausage-stuffed flatlanders — and, of course, to Jews, who were viciously scorned by Goebbels and company.

In short, Riefenstahl's two major films aestheticized and romanticized fascist values. The dazzling geometries of masses on the march may have been in the cinematic air just then: Look for Riefenstahl's sources in Busby Berkeley's musical extravaganzas as well as in the 1932 German communist film "Kuhle Wampe" (co-written by Bertolt Brecht). But backed by the full faith and credit of an evil government providing thousands of malleable extras, she could provide grand spectacle on an unprecedented scale. Why Riefenstahl's work would continue to impress critics — even Sontag, Riefenstahl's most implacable critical enemy, calls them the two greatest documentaries ever made — is a mystery, given the corruption of their origins and the fact that they are visibly not documentaries at all.

With world war looming, the international film community was titillated by but ultimately shunned Riefenstahl’s gifts, while her chief patron was, shall we say, distracted by more pressing matters. She was a silent witness to an atrocity in Poland early in the war (though she later claimed to have protested the massacre), and during the filming of “Tiefland” blithely employed as extras some Gypsy slave laborers who later perished in death camps. It was a sort of neo-mountain film, personally financed by Hitler but released after the war to a numbed response. By then, she was fighting tigerishly to distance herself from Hitler, though Bach has uncovered much damning gush from her to him. At the end of her life, Riefenstahl discovered a primitive African tribe, the Nuba, and found in them the noble savagery she had celebrated in the Alpine films. She published a beautiful, disturbing picture book about them which had a certain rehabilitative effect on her reputation — though not for Bach or this reader.

It is difficult to overpraise Bach's efforts: Living the biographer's nightmare, trapped for a decade with a loathsome subject, Bach is determined to present her coolly, ironically, without loss of his own moral vector. What emerges is a compulsively readable and scrupulously crafted work, not unlike Klaus Mann's "Mephisto," that devastating novel about the actor Gustav Gründgens, another of Hitler's several semiconscious cultural ornaments-apologists. I do not believe this fundamentally ignorant woman ever perceived the inherent evil in Nazism. Her anti-Semitism was less virulent than reflexive — the common coin of many realms (including the United States) at the time. The disguise she wrapped around her ambition was that absurd, often unpleasant and peculiarly European one of the Grand Maestro, all art for art's sake — hysteria and narcissism mixed with contempt for her collaborators, grandiose graciousness to her groveling fans and patrons, and a talent that was all technique, no soul. She stood deluded at the center of evil and saw it only as a source of funding.

Bach ends his book with a quotation from Simone Weil: "The only people who can give the impression of having risen to a higher plane, who seem superior to ordinary human misery, are people who resort to the aids of illusion, exaltation, fanaticism, to conceal the harshness of destiny from their own eyes. The man who does not wear the armor of the lie cannot experience force without being touched by it to his very soul."

Which brings me back to the point at which I began. Leni Riefenstahl used and was used heedlessly and amorally. That would have been true even if she had functioned in a liberal democracy, where she would have acted just as she did in Hitler's Germany, insisting that her aspirations were for only the finest things. What she received for her efforts were the metaphorical mink coats and diamond bracelets of the whoredom that never speaks its name — because it cannot imagine the word applying to an artiste of such impeccable idealism.


Richard Schickel is a film critic for Time and the author of many books, including "Elia Kazan: A Biography" and "The Essential Chaplin."

Tuesday, March 6, 2007

Commentary Magazine


All That Jazz
Terry Teachout

March 2007



Rare are the writers willing to undertake large-scale histories aimed at a general audience. Yet when such books are engagingly and accessibly written, sufficiently comprehensive, and animated by a strongly personal point of view—as are H.G. Wells’s The Outline of History (1920), E.H. Gombrich’s The Story of Art (1950), and Paul Johnson’s Modern Times (1983)—they can become both popular and influential.

When it comes to jazz, comparatively few attempts have been made to write a general history that fulfills these requirements, and only a half-dozen such books have appeared since World War II. The most ambitious of them, Alyn Shipton’s A New History of Jazz, was not widely noticed in this country on its original release in 2001.1 But it has now been reissued in an expanded and extensively revised edition. At over 800 closely packed pages, this is the most detailed historical survey of jazz yet to be published.2

Like all such books, A New History of Jazz has its share of errors and other flaws, and its length will no doubt prevent it from being taken up by the public at large. Yet Shipton, an English broadcaster and musician whose previous books include biographies of Dizzy Gillespie (1999) and Fats Waller (1988), has done more than any previous commentator to cut through the thick underbrush of unsubstantiated opinion and provide a clearly written, factually trustworthy account of jazz’s complex and controversial history. If A New History of Jazz is not the ideal single-volume chronicle for which lovers of this music have been waiting, that is in part because jazz itself is peculiarly resistant to such concise treatment.

_____________

Why have so few general histories of jazz been produced? In addition to being a relatively young art form, jazz is also a vernacular music that is usually (though not always) played in commercial settings. For this reason, scholars in America and elsewhere were long reluctant to take it seriously. Academic research into its origins and early development did not begin in earnest until after most of its founding figures were dead. This meant that, for much of its century-long history, jazz was written about mainly by enthusiasts whose technical knowledge of music was limited or nonexistent. As Gunther Schuller observed in Early Jazz: Its Roots and Musical Development (1968), the first full-length historical-analytic study by a scholar with professional performing experience:

The majority of [jazz] books have concentrated on the legendry of jazz, and over the years a body of writing has accumulated which is little more than an amalgam of well-meaning amateur criticism and fascinated opinion.

By 1950, “legendry” had hardened into a widely accepted narrative not unlike a creation myth. In the baldest form of this myth, jazz was created at the turn of the 20th century by a group of black New Orleans musicians descended from slaves who “Westernized” the polyrhythms and microtonal melodic inflections of their African ancestors, thereby bringing into being a new form of improvised folk music played by small instrumental ensembles. After World War I, the most gifted of these men emigrated to Chicago and (later) other American cities, where their music was embraced by progressive-minded musicians and listeners.

One uniquely talented émigré by the name of Louis Armstrong (the myth continues) broke with the ensemble tradition of his youth to forge a virtuoso solo idiom that became the font of all subsequent stylistic developments in jazz. Armstrong in turn was followed by a series of black innovators who, building on his achievements, expanded the language of jazz still further. Thus, every major jazz musician can trace his stylistic descent through Armstrong to the black ur-jazz of New Orleans, which is the sole source of the music’s authentic mainstream.

Virtually all of the earliest general histories and analytic studies of jazz—including Robert Goffin’s Aux Frontières du Jazz (1932), Wilder Hobson’s American Jazz Music (1939), Frederic Ramsey and Charles Edward Smith’s Jazzmen (1939), Hughes Panassié’s The Real Jazz (1942), Marshall Stearns’s The Story of Jazz (1956), and Martin Williams’s The Jazz Tradition (1970)—took the broad accuracy of this myth more or less for granted. Even though it simplified and misrepresented history in any number of significant ways, the absence of serious primary-source research into the origins of jazz made it inevitable that “legendry” would get the better of fact. Indeed, to this day the creation myth continues to be espoused (albeit in a more subtle form) by amateur historians like Stanley Crouch and Ken Burns.3

This picture began to change with the appearance of such musically trained commentators as Schuller, Richard M. Sudhalter, and Max Harrison. Schuller’s Early Jazz and Sudhalter’s Bix: Man and Legend (1974, co-written with Philip R. Evans and William Dean-Myatt)—the first primary-source biography of a major jazz musician—set new standards for jazz historians, while the New Grove Dictionary of Music and Musicians (1980) included a highly sophisticated key article on jazz written by Harrison.4 By the 90’s, serious scholarship had started to come into its own, and factually reliable books like Ted Gioia’s West Coast Jazz: Modern Jazz in California, 1945-1960 (1992), Scott DeVeaux’s The Birth of Bebop: A Social and Musical History (1997), and Sudhalter’s Lost Chords: White Musicians and Their Contribution to Jazz, 1915-1945 (1999) were becoming common, if not commonplace.

_____________

The first “modern” survey histories of jazz, Frank Tirro’s Jazz: A History (1977) and James Lincoln Collier’s The Making of Jazz: A Comprehensive History (1978), came along too soon to profit from the new jazz scholarship. But by 1997, when Ted Gioia published The History of Jazz, it was possible for such authors to draw on a substantial body of primary research untainted by starry-eyed legendry.

Gioia himself succeeded to a considerable degree in breaking away from the tattered remnants of the creation myth. Still, his discussion of jazz in the 20’s and 30’s was too superficial to give a clear picture of the music’s growth (he was much stronger on post-1945 stylistic developments). Moreover, he failed to define his intended audience with sufficient precision, meaning that his book, for all its considerable virtues, fell between two stools. As I wrote in a review of The History of Jazz in the Wall Street Journal, “Fans would have been better served by a book half as long, musicians by one twice as long.”

For whom, then, is Shipton’s A New History of Jazz intended? Like Gioia’s book, it is written in (mostly) non-technical language and contains no musical examples, suggesting that it is meant for the general reader. But it is twice as long as The History of Jazz, and my guess is that untrained amateurs will also be put off by its proliferation of detail. On the other hand, its great length is precisely what enables Shipton to avoid oversimplifying the story of jazz’s development.

Most historical surveys, by focusing on a restricted number of major figures, give the false impression that jazz was invented ab ovo by a handful of creative giants, and that its history can be correctly understood as an unbroken mainstream of progress running from New Orleans in 1900 to the present day. In fact, however, nobody “invented” jazz, and its “mainstream” is a series of parallel lines of development that converge in some places but diverge in others.

As Shipton explains, popular musicians throughout America, many of whom had “little or no first-hand exposure to New Orleans musicians,” were experimenting at the turn of the 20th century with new styles of dance music in which syncopation played a prominent part. It was out of this widespread musical ferment that jazz emerged, and while it appears to have first taken recognizable shape in New Orleans, it was being played in other cities around the same time or shortly afterward.

What was true in the beginning remained true thereafter. Jazz has been played in many different ways throughout its history, and, as Shipton makes clear, there has never been a single “authentic” style conclusively superior to all others. Louis Armstrong is the only figure whose stylistic innovations achieved anything like universal currency, and Armstrong himself, for all his extraordinary originality, was only one of a number of musicians who helped shape the language of early jazz. Accordingly, Shipton presents his history not as a lineal succession of great men but as an overlapping series of stylistic movements, each of which he describes with a catholicity of taste unusual among jazz commentators.

Time and again Shipton steers clear of the errors that have tripped up so many of his predecessors—especially those who have insisted on viewing jazz through a racial prism. White ensembles and players like the Original Dixieland Jazz Band, Paul Whiteman, and Woody Herman are taken as seriously and treated as fairly in A New History of Jazz as are Jelly Roll Morton, Fletcher Henderson, and Count Basie. Nor is this fair-mindedness limited to the case of white artists, as can be seen from Shipton’s thoughtful discussions of figures whose popular success caused them to be inadequately appreciated by critics: among others, Cab Calloway, the John Kirby Sextet, and the Mills Brothers.

Above all, Shipton has a gift for crisp, vivid summary without which it is impossible to write an effective survey history—a gift rooted in the fact that while he is a performing musician, he has also spent much of his career working as a journalist. A case in point is his treatment of the sharply contrasting styles of Coleman Hawkins and Lester Young, the two most influential saxophonists of the swing era:

However brilliantly and rhapsodically Hawkins built his solos, such as on “Body and Soul,” he usually did so by moving away from the composer’s original melody as quickly as was practical to do so, after milking it for the dramatic effect of his opening statements, and then relying almost totally on the harmonic framework of the piece. Young, however, was much more of a melodist . . . and he preferred to superimpose the logic of his melodic lines over an underlying chord structure, even when those chords were more complex than his melodic ideas.

No less striking is the ease with which Shipton negotiates the great stylistic divide that separates pre- and post-1960 jazz. Most authors of survey histories in any field of art come to grief when writing about movements for which they feel no sympathy. Paul Johnson’s Art: A New History (2003), for instance, is willfully, almost obsessively dismissive of modern art. In A New History of Jazz, by contrast, the avant-garde jazz of the 60’s is described with a relish rarely to be found among performers who, like Shipton, have been closely identified with traditional jazz.5

_____________

Still, despite its author’s wide-ranging sympathies, the last quarter of A New History of Jazz, which carries the story from 1960 to the present day, is far less confident than that which precedes it, while the final hundred pages are scarcely more than a hectic, ill-sorted catalogue of present-day performers in which many major figures (like the guitarist Bill Frisell) are mentioned only in passing or (like the composer-bandleader Maria Schneider) are omitted altogether.

Part of Shipton’s problem is that he appears to subscribe to the idea of aesthetic progress. As a result, he has little interest in conservatively-inclined younger players, like the guitarist-vocalist John Pizzarelli, who seek to explore and revitalize older styles. Their work, he claims, “does not have the sense of adventure or excitement that has generally been associated with the cutting edge of jazz.” (The cant phrase “cutting edge” suggests that Shipton is here bowing to conventional critical wisdom rather than thinking for himself.) Nor does he consider the later careers of important older artists, like the guitarist Jim Hall and the valve trombonist Bob Brookmeyer, who came to prominence in the 50’s but have done their most original and distinctive work in the past quarter-century.

To be sure, it may be that contemporary jazz simply does not lend itself to the narrative style employed so effectively in the earlier sections of A New History. Prior to 1970, jazz’s fast-growing stylistic diversity had not yet compromised the underlying integrity of its common musical language. Even the truly radical innovations of avant-gardists of the 60’s like the alto saxophonist Ornette Coleman were rooted in a body of performance practices known to all musicians and listeners. Not only did the quartet that recorded such early Coleman albums as The Shape of Jazz to Come (1959) and Change of the Century (1960) feature a conventional instrumentation—saxophone, trumpet, bass, and drums—but its members played in a style self-evidently related, however distantly, to traditional jazz. Coleman’s solos, for instance, contained the same “vocalized” inflections heard in the playing of Charlie Parker; and Charlie Haden, his bassist, accompanied them with walking-bass lines similar to those used in swing and bebop. To put it another way, many people doubted that Coleman’s music was good jazz, but because it sounded like jazz, few refused to admit that it was jazz.

After 1970, though, this commonality of practice began to grow increasingly tenuous, ultimately to the edge of nullity. In “Postmodern Jazz,” the final chapter of A New History, Shipton admits that while his pre-1970 history appears to be “a straightforward narrative” marked by “a clear sense of development,” contemporary jazz can no longer be described in such terms. Thanks to “the virtual availability, in recorded form,” of jazz’s entire history,

[i]t is possible for a musician to embark on a career at the beginning of the 21st century and choose to assimilate elements from almost any style in the history of jazz as a starting point. . . . The days when musicians learned at the knees of older players, served their apprenticeships in big bands, participated in after-hours jam sessions, congregated in dressing rooms for impromptu opportunities to play, have all largely gone.

The result, as Max Harrison presciently noted in his 1980 New Grove article, is that “jazz no longer has a lingua franca. . . . There is instead an extreme diversity of styles and methods, and this situation is international.” If anything, this diversity has grown more pronounced since 1980. A postmodern group like the Bad Plus, for instance, uses the same instrumentation as the Bill Charlap Trio—piano, acoustic bass, and drums—and most musicians consider it to be a jazz ensemble. Yet its blunt, explosively loud versions of such rock-and-roll songs as Blondie’s “Heart of Glass” and Black Sabbath’s “Iron Man” appear at first glance to have nothing in common with Charlap’s elegantly subdued yet swinging interpretations of standard ballads like Harold Arlen’s “The Man That Got Away.”

Does the Bad Plus (which is not included in A New History) play jazz, or some other, newer kind of music? And does the fact that the music of Charlap (who also goes unmentioned by Shipton) is less obviously original than that of the Bad Plus somehow make it less good? Or was Max Harrison right to claim that in the brave new world of postmodern jazz, all styles are equally valid and equally “jazzy”?

_____________

In recent years, many jazz musicians have looked for the answers to such questions in a famous remark made by the pianist Bill Evans and quoted in A New History:

Jazz is not a what, it is a how. If it were a what, it would be static, never growing. The how is that the music comes from the moment, it is spontaneous, it exists in the time it is created. And anyone who makes music according to this method conveys to me an element that makes his music jazz.

Alyn Shipton clearly understands the implications of this remark, and the catholicity with which he describes pre-1970 jazz promises an equally clear understanding of later styles. “In what follows,” he writes in his introduction, “I have attempted to examine what was being described as jazz throughout its history, and I have taken a very broad view of how jazz should now be defined.” But, despite this broad perspective, he does not succeed in integrating postmodern jazz into his narrative.

His failure to do so reinforces my own belief that it is not yet possible to write a coherent historical survey that includes post-1970 stylistic developments. Not only are we too close in time to the jazz of the 70’s, 80’s, and 90’s to write about it with detachment, but it is by no means clear that postmodern jazz is itself sufficiently coherent to be grasped as a unified phenomenon continuous with pre-1970 predecessors.6

Still, even if the many kinds of music that we continue to call “jazz” no longer have enough in common to be discussed collectively, most listeners and critics, myself included, stubbornly persist in viewing them as parts of a whole, unified (in Bill Evans’s words) not by their “whatness” but by their “howness.” Perhaps some jazz scholar as yet unborn will be able to explain to our children why we were right to do so.

Footnotes

1 Richard M. Sudhalter reviewed it in the March 2002 COMMENTARY.

2 Continuum, 832 pp., $34.95.

3 For a discussion of the amateur tradition in jazz writing, see my essay "Jazz and Its Explainers" (COMMENTARY, February 2001).

4 A revised version of Harrison's article was reprinted separately in the one-volume New Grove Gospel, Blues, and Jazz (1986).

5 In his zeal for the 60's, Shipton unfortunately short-changes or omits discussion of such admired earlier players as the clarinetist Pee Wee Russell, the trombonist Jack Teagarden, and the drummers Sid Catlett and Dave Tough.

6 For this reason, my three-part survey of the history of recorded jazz (published in COMMENTARY in November and December 1999 and January 2000) came to a close with Weather Report's "Birdland," recorded in 1977.
About the Author

Terry Teachout, COMMENTARY’s regular music critic and the drama critic of the Wall Street Journal, is writing Hotter Than That: A Life of Louis Armstrong. He blogs about the arts at www.terryteachout.com.


Monday, March 5, 2007

The New York Times

March 4, 2007

Darwin’s God

God has always been a puzzle for Scott Atran. When he was 10 years old, he scrawled a plaintive message on the wall of his bedroom in Baltimore. “God exists,” he wrote in black and orange paint, “or if he doesn’t, we’re in trouble.” Atran has been struggling with questions about religion ever since — why he himself no longer believes in God and why so many other people, everywhere in the world, apparently do.

Call it God; call it superstition; call it, as Atran does, “belief in hope beyond reason” — whatever you call it, there seems an inherent human drive to believe in something transcendent, unfathomable and otherworldly, something beyond the reach or understanding of science. “Why do we cross our fingers during turbulence, even the most atheistic among us?” asked Atran when we spoke at his Upper West Side pied-à-terre in January.

Atran, who is 55, is an anthropologist at the National Center for Scientific Research in Paris, with joint appointments at the University of Michigan and the John Jay College of Criminal Justice in New York. His research interests include cognitive science and evolutionary biology, and sometimes he presents students with a wooden box that he pretends is an African relic.

“If you have negative sentiments toward religion,” he tells them, “the box will destroy whatever you put inside it.” Many of his students say they doubt the existence of God, but in this demonstration they act as if they believe in something. Put your pencil into the magic box, he tells them, and the nonbelievers do so blithely. Put in your driver’s license, he says, and most do, but only after significant hesitation. And when he tells them to put in their hands, few will.

If they don’t believe in God, what exactly are they afraid of?

Atran first conducted the magic-box demonstration in the 1980s, when he was at Cambridge University studying the nature of religious belief. He had received a doctorate in anthropology from Columbia University and, in the course of his fieldwork, saw evidence of religion everywhere he looked — at archaeological digs in Israel, among the Mayans in Guatemala, in artifact drawers at the American Museum of Natural History in New York. Atran is Darwinian in his approach, which means he tries to explain behavior by how it might once have solved problems of survival and reproduction for our early ancestors. But it was not clear to him what evolutionary problems might have been solved by religious belief. Religion seemed to use up physical and mental resources without an obvious benefit for survival. Why, he wondered, was religion so pervasive, when it was something that seemed so costly from an evolutionary point of view?

The magic-box demonstration helped set Atran on a career studying why humans might have evolved to be religious, something few people were doing back in the ’80s. Today, the effort has gained momentum, as scientists search for an evolutionary explanation for why belief in God exists — not whether God exists, which is a matter for philosophers and theologians, but why the belief does.

This is different from the scientific assault on religion that has been garnering attention recently, in the form of best-selling books from scientific atheists who see religion as a scourge. In “The God Delusion,” published last year and still on best-seller lists, the Oxford evolutionary biologist Richard Dawkins concludes that religion is nothing more than a useless, and sometimes dangerous, evolutionary accident. “Religious behavior may be a misfiring, an unfortunate byproduct of an underlying psychological propensity which in other circumstances is, or once was, useful,” Dawkins wrote. He is joined by two other best-selling authors — Sam Harris, who wrote “The End of Faith,” and Daniel Dennett, a philosopher at Tufts University who wrote “Breaking the Spell.” The three men differ in their personal styles and in the degree to which they are engaged in a battle against religiosity, but their names are often mentioned together. They have been portrayed as an unholy trinity of neo-atheists, promoting their secular world view with a fervor that seems almost evangelical.

Lost in the hullabaloo over the neo-atheists is a quieter and potentially more illuminating debate. It is taking place not between science and religion but within science itself, specifically among the scientists studying the evolution of religion. These scholars tend to agree on one point: that religious belief is an outgrowth of brain architecture that evolved during early human history. What they disagree about is why a tendency to believe evolved, whether it was because belief itself was adaptive or because it was just an evolutionary byproduct, a mere consequence of some other adaptation in the evolution of the human brain.

Which is the better biological explanation for a belief in God — evolutionary adaptation or neurological accident? Is there something about the cognitive functioning of humans that makes us receptive to belief in a supernatural deity? And if scientists are able to explain God, what then? Is explaining religion the same thing as explaining it away? Are the nonbelievers right, and is religion at its core an empty undertaking, a misdirection, a vestigial artifact of a primitive mind? Or are the believers right, and does the fact that we have the mental capacities for discerning God suggest that it was God who put them there?

In short, are we hard-wired to believe in God? And if we are, how and why did that happen?

“All of our raptures and our drynesses, our longings and pantings, our questions and beliefs . . . are equally organically founded,” William James wrote in “The Varieties of Religious Experience.” James, who taught philosophy and experimental psychology at Harvard for more than 30 years, based his book on a 1901 lecture series in which he took some early tentative steps at breaching the science-religion divide.

In the century that followed, a polite convention generally separated science and religion, at least in much of the Western world. Science, as the old trope had it, was assigned the territory that describes how the heavens go; religion, how to go to heaven.

Anthropologists like Atran and psychologists as far back as James had been looking at the roots of religion, but the mutual hands-off policy really began to shift in the 1990s. Religion made incursions into the traditional domain of science with attempts to bring intelligent design into the biology classroom and to choke off human embryonic stem-cell research on religious grounds. Scientists responded with counterincursions. Experts from the hard sciences, like evolutionary biology and cognitive neuroscience, joined anthropologists and psychologists in the study of religion, making God an object of scientific inquiry.

The debate over why belief evolved is between byproduct theorists and adaptationists. You might think that the byproduct theorists would tend to be nonbelievers, looking for a way to explain religion as a fluke, while the adaptationists would be more likely to be believers who can intuit the emotional, spiritual and community advantages that accompany faith. Or you might think they would all be atheists, because what believer would want to subject his own devotion to rationalism’s cold, hard scrutiny? But a scientist’s personal religious view does not always predict which side he will take. And this is just one sign of how complex and surprising this debate has become.

Angels, demons, spirits, wizards, gods and witches have peppered folk religions since mankind first started telling stories. Charles Darwin noted this in “The Descent of Man.” “A belief in all-pervading spiritual agencies,” he wrote, “seems to be universal.” According to anthropologists, religions that share certain supernatural features — belief in a noncorporeal God or gods, belief in the afterlife, belief in the ability of prayer or ritual to change the course of human events — are found in virtually every culture on earth.

This is certainly true in the United States. About 6 in 10 Americans, according to a 2005 Harris Poll, believe in the devil and hell, and about 7 in 10 believe in angels, heaven and the existence of miracles and of life after death. A 2006 survey at Baylor University found that 92 percent of respondents believe in a personal God — that is, a God with a distinct set of character traits ranging from “distant” to “benevolent.”

When a trait is universal, evolutionary biologists look for a genetic explanation and wonder how that gene or genes might enhance survival or reproductive success. In many ways, it’s an exercise in post-hoc hypothesizing: what would have been the advantage, when the human species first evolved, for an individual who happened to have a mutation that led to, say, a smaller jaw, a bigger forehead, a better thumb? How about certain behavioral traits, like a tendency for risk-taking or for kindness?

Atran saw such questions as a puzzle when applied to religion. So many aspects of religious belief involve misattribution and misunderstanding of the real world. Wouldn’t this be a liability in the survival-of-the-fittest competition? To Atran, religious belief requires taking “what is materially false to be true” and “what is materially true to be false.” One example of this is the belief that even after someone dies and the body demonstrably disintegrates, that person will still exist, will still be able to laugh and cry, to feel pain and joy. This confusion “does not appear to be a reasonable evolutionary strategy,” Atran wrote in “In Gods We Trust: The Evolutionary Landscape of Religion” in 2002. “Imagine another animal that took injury for health or big for small or fast for slow or dead for alive. It’s unlikely that such a species could survive.” He began to look for a sideways explanation: if religious belief was not adaptive, perhaps it was associated with something else that was.

Atran intended to study mathematics when he entered Columbia as a precocious 17-year-old. But he was distracted by the radical politics of the late ’60s. One day in his freshman year, he found himself at an antiwar rally listening to Margaret Mead, then perhaps the most famous anthropologist in America. Atran, dressed in a flamboyant Uncle Sam suit, stood up and called her a sellout for saying the protesters should be writing to their congressmen instead of staging demonstrations. “Young man,” the unflappable Mead said, “why don’t you come see me in my office?”

Atran, equally unflappable, did go to see her — and ended up working for Mead, spending much of his time exploring the cabinets of curiosities in her tower office at the American Museum of Natural History. Soon he switched his major to anthropology.

Many of the museum specimens were religious, Atran says. So were the artifacts he dug up on archaeological excursions in Israel in the early ’70s. Wherever he turned, he encountered the passion of religious belief. Why, he wondered, did people work so hard against their preference for logical explanations to maintain two views of the world, the real and the unreal, the intuitive and the counterintuitive?

Maybe cognitive effort was precisely the point: maybe it took less mental work than Atran realized to hold belief in God in one’s mind. Maybe, in fact, belief was the default position for the human mind, something that took no cognitive effort at all.

While still an undergraduate, Atran decided to explore these questions by organizing a conference on universal aspects of culture and inviting all his intellectual heroes: the linguist Noam Chomsky, the psychologist Jean Piaget, the anthropologists Claude Levi-Strauss and Gregory Bateson (who was also Margaret Mead’s ex-husband), the Nobel Prize-winning biologists Jacques Monod and Francois Jacob. It was 1974, and the only site he could find for the conference was just outside Paris. Atran was a scraggly 22-year-old with a guitar who had learned his French from comic books. To his astonishment, everyone he invited agreed to come.

Atran is a sociable man with sharp hazel eyes, who sparks provocative conversations the way other men pick bar fights. As he traveled in the ’70s and ’80s, he accumulated friends who were thinking about the issues he was: how culture is transmitted among human groups and what evolutionary function it might serve. “I started looking at history, and I wondered why no society ever survived more than three generations without a religious foundation as its raison d’être,” he says. Soon he turned to an emerging subset of evolutionary theory — the evolution of human cognition.

Some cognitive scientists think of brain functioning in terms of modules, a series of interconnected machines, each one responsible for a particular mental trick. They do not tend to talk about a God module per se; they usually consider belief in God a consequence of other mental modules.

Religion, in this view, is “a family of cognitive phenomena that involves the extraordinary use of everyday cognitive processes,” Atran wrote in “In Gods We Trust.” “Religions do not exist apart from the individual minds that constitute them and the environments that constrain them, any more than biological species and varieties exist independently of the individual organisms that compose them and the environments that conform them.”

At around the time “In Gods We Trust” appeared five years ago, a handful of other scientists — Pascal Boyer, now at Washington University; Justin Barrett, now at Oxford; Paul Bloom at Yale — were addressing these same questions. In synchrony they were moving toward the byproduct theory.

Darwinians who study physical evolution distinguish between traits that are themselves adaptive, like having blood cells that can transport oxygen, and traits that are byproducts of adaptations, like the redness of blood. There is no survival advantage to blood’s being red instead of turquoise; it is just a byproduct of the trait that is adaptive, having blood that contains hemoglobin.

Something similar explains aspects of brain evolution, too, say the byproduct theorists. Which brings us to the idea of the spandrel.

Stephen Jay Gould, the famed evolutionary biologist at Harvard who died in 2002, and his colleague Richard Lewontin proposed “spandrel” to describe a trait that has no adaptive value of its own. They borrowed the term from architecture, where it originally referred to the V-shaped structure formed between two rounded arches. The structure is not there for any purpose; it is there because that is what happens when arches align.

In architecture, a spandrel can be neutral or it can be made functional. Building a staircase, for instance, creates a space underneath that is innocuous, just a blank sort of triangle. But if you put a closet there, the under-stairs space takes on a function, unrelated to the staircase’s but useful nonetheless. Either way, functional or nonfunctional, the space under the stairs is a spandrel, an unintended byproduct.

“Natural selection made the human brain big,” Gould wrote, “but most of our mental properties and potentials may be spandrels — that is, nonadaptive side consequences of building a device with such structural complexity.”

The possibility that God could be a spandrel offered Atran a new way of understanding the evolution of religion. But a spandrel of what, exactly?

Hardships of early human life favored the evolution of certain cognitive tools, among them the ability to infer the presence of organisms that might do harm, to come up with causal narratives for natural events and to recognize that other people have minds of their own with their own beliefs, desires and intentions. Psychologists call these tools, respectively, agent detection, causal reasoning and theory of mind.

Agent detection evolved because assuming the presence of an agent — which is jargon for any creature with volitional, independent behavior — is more adaptive than assuming its absence. If you are a caveman on the savannah, you are better off presuming that the motion you detect out of the corner of your eye is an agent and something to run from, even if you are wrong. If it turns out to have been just the rustling of leaves, you are still alive; if what you took to be leaves rustling was really a hyena about to pounce, you are dead.

A classic experiment from the 1940s by the psychologists Fritz Heider and Marianne Simmel suggested that imputing agency is so automatic that people may do it even for geometric shapes. For the experiment, subjects watched a film of triangles and circles moving around. When asked what they had been watching, the subjects used words like “chase” and “capture.” They did not just see the random movement of shapes on a screen; they saw pursuit, planning, escape.

So if there is motion just out of our line of sight, we presume it is caused by an agent, an animal or person with the ability to move independently. This usually operates in one direction only; lots of people mistake a rock for a bear, but almost no one mistakes a bear for a rock.

What does this mean for belief in the supernatural? It means our brains are primed for it, ready to presume the presence of agents even when such presence confounds logic. “The most central concepts in religions are related to agents,” Justin Barrett, a psychologist, wrote in his 2004 summary of the byproduct theory, “Why Would Anyone Believe in God?” Religious agents are often supernatural, he wrote, “people with superpowers, statues that can answer requests or disembodied minds that can act on us and the world.”

A second mental module that primes us for religion is causal reasoning. The human brain has evolved the capacity to impose a narrative, complete with chronology and cause-and-effect logic, on whatever it encounters, no matter how apparently random. “We automatically, and often unconsciously, look for an explanation of why things happen to us,” Barrett wrote, “and ‘stuff just happens’ is no explanation. Gods, by virtue of their strange physical properties and their mysterious superpowers, make fine candidates for causes of many of these unusual events.” The ancient Greeks believed thunder was the sound of Zeus’s thunderbolt. Similarly, a contemporary woman whose cancer treatment works despite 10-to-1 odds might look for a story to explain her survival. It fits better with her causal-reasoning tool for her recovery to be a miracle, or a reward for prayer, than for it to be just a lucky roll of the dice.

A third cognitive trick is a kind of social intuition known as theory of mind. It’s an odd phrase for something so automatic, since the word “theory” suggests formality and self-consciousness. Other terms have been used for the same concept, like intentional stance and social cognition. One good alternative is the term Atran uses: folkpsychology.

Folkpsychology, as Atran and his colleagues see it, is essential to getting along in the contemporary world, just as it has been since prehistoric times. It allows us to anticipate the actions of others and to lead others to believe what we want them to believe; it is at the heart of everything from marriage to office politics to poker. People without this trait, like those with severe autism, are impaired, unable to imagine themselves in other people’s heads.

The process begins with positing the existence of minds, our own and others’, that we cannot see or feel. This leaves us open, almost instinctively, to belief in the separation of the body (the visible) and the mind (the invisible). If you can posit minds in other people that you cannot verify empirically, suggests Paul Bloom, a psychologist and the author of “Descartes’ Baby,” published in 2004, it is a short step to positing minds that do not have to be anchored to a body. And from there, he said, it is another short step to positing an immaterial soul and a transcendent God.

The traditional psychological view has been that until about age 4, children think that minds are permeable and that everyone knows whatever the child himself knows. To a young child, everyone is infallible. All other people, especially Mother and Father, are thought to have the same sort of insight as an all-knowing God.

But at a certain point in development, this changes. (Some new research suggests this might occur as early as 15 months.) The “false-belief test” is a classic experiment that highlights the boundary. Children watch a puppet show with a simple plot: John comes onstage holding a marble, puts it in Box A and walks off. Mary comes onstage, opens Box A, takes out the marble, puts it in Box B and walks off. John comes back onstage. The children are asked, Where will John look for the marble?

Very young children, or autistic children of any age, say John will look in Box B, since they know that’s where the marble is. But older children give a more sophisticated answer. They know that John never saw Mary move the marble and that as far as he is concerned it is still where he put it, in Box A. Older children have developed a theory of mind; they understand that other people sometimes have false beliefs. Even though they know that the marble is in Box B, they respond that John will look for it in Box A.

The adaptive advantage of folkpsychology is obvious. According to Atran, our ancestors needed it to survive their harsh environment, since folkpsychology allowed them to “rapidly and economically” distinguish good guys from bad guys. But how did folkpsychology — an understanding of ordinary people’s ordinary minds — allow for a belief in supernatural, omniscient minds? And if the byproduct theorists are right and these beliefs were of little use in finding food or leaving more offspring, why did they persist?

Atran ascribes the persistence to evolutionary misdirection, which, he says, happens all the time: “Evolution always produces something that works for what it works for, and then there’s no control for however else it’s used.” On a sunny weekday morning, over breakfast at a French cafe on upper Broadway, he tried to think of an analogy and grinned when he came up with an old standby: women’s breasts. Because they are associated with female hormones, he explained, full breasts indicate a woman is fertile, and the evolution of the male brain’s preference for them was a clever mating strategy. But breasts are now used for purposes unrelated to reproduction, to sell anything from deodorant to beer. “A Martian anthropologist might look at this and say, ‘Oh, yes, so these breasts must have somehow evolved to sell hygienic stuff or food to human beings,’ ” Atran said. But the Martian would, of course, be wrong. Equally wrong would be to make the same mistake about religion, thinking it must have evolved to make people behave a certain way or feel a certain allegiance.

That is what most fascinated Atran. “Why is God in there?” he wondered.

The idea of an infallible God is comfortable and familiar, something children readily accept. You can see this in the experiment Justin Barrett conducted recently — a version of the traditional false-belief test but with a religious twist. Barrett showed young children a box with a picture of crackers on the outside. What do you think is inside this box? he asked, and the children said, “Crackers.” Next he opened it and showed them that the box was filled with rocks. Then he asked two follow-up questions: What would your mother say is inside this box? And what would God say?

As earlier theory-of-mind experiments already showed, 3- and 4-year-olds tended to think Mother was infallible, and since the children knew the right answer, they assumed she would know it, too. They usually responded that Mother would say the box contained rocks. But 5- and 6-year-olds had learned that Mother, like any other person, could hold a false belief in her mind, and they tended to respond that she would be fooled by the packaging and would say, “Crackers.”

And what would God say? No matter what their age, the children, who were all Protestants, told Barrett that God would answer, “Rocks.” This was true even for the older children, who, as Barrett understood it, had developed folkpsychology and had used it when predicting a wrong response for Mother. They had learned that, in certain situations, people could be fooled — but they had also learned that there is no fooling God.

The bottom line, according to byproduct theorists, is that children are born with a tendency to believe in omniscience, invisible minds, immaterial souls — and then they grow up in cultures that fill their minds, hard-wired for belief, with specifics. It is a little like language acquisition, Paul Bloom says, with the essential difference that language is a biological adaptation and religion, in his view, is not. We are born with an innate facility for language but the specific language we learn depends on the environment in which we are raised. In much the same way, he says, we are born with an innate tendency for belief, but the specifics of what we grow up believing — whether there is one God or many, whether the soul goes to heaven or occupies another animal after death — are culturally shaped.

Whatever the specifics, certain beliefs can be found in all religions. Those that prevail, according to the byproduct theorists, are those that fit most comfortably with our mental architecture. Psychologists have shown, for instance, that people attend to, and remember, things that are unfamiliar and strange, but not so strange as to be impossible to assimilate. Ideas about God or other supernatural agents tend to fit these criteria. They are what Pascal Boyer, an anthropologist and psychologist, called “minimally counterintuitive”: weird enough to get your attention and lodge in your memory but not so weird that you reject them altogether. A tree that talks is minimally counterintuitive, and you might accept it as a supernatural agent. A tree that talks and flies and time-travels is maximally counterintuitive, and you are more likely to reject it.

Atran, along with Ara Norenzayan of the University of British Columbia, studied the idea of minimally counterintuitive agents earlier this decade. They presented college students with lists of fantastical creatures and asked them to choose the ones that seemed most “religious.” The convincingly religious agents, the students said, were not the most outlandish — not the turtle that chatters and climbs or the squealing, flowering marble — but those that were just outlandish enough: giggling seaweed, a sobbing oak, a talking horse. Giggling seaweed meets the requirement of being minimally counterintuitive, Atran wrote. So does a God who has a human personality except that he knows everything or a God who has a mind but has no body.

It is not enough for an agent to be minimally counterintuitive for it to earn a spot in people’s belief systems. An emotional component is often needed, too, if belief is to take hold. “If your emotions are involved, then that’s the time when you’re most likely to believe whatever the religion tells you to believe,” Atran says. Religions stir up emotions through their rituals — swaying, singing, bowing in unison during group prayer, sometimes working people up to a state of physical arousal that can border on frenzy. And religions gain strength during the natural heightening of emotions that occurs in times of personal crisis, when the faithful often turn to shamans or priests. The most intense personal crisis, for which religion can offer powerfully comforting answers, is when someone comes face to face with mortality.

In John Updike’s celebrated early short story “Pigeon Feathers,” 14-year-old David spends a lot of time thinking about death. He suspects that adults are lying when they say his spirit will live on after he dies. He keeps catching them in inconsistencies when he asks where exactly his soul will spend eternity. “Don’t you see,” he cries to his mother, “if when we die there’s nothing, all your sun and fields and what not are all, ah, horror? It’s just an ocean of horror.”

The story ends with David’s tiny revelation and his boundless relief. The boy gets a gun for his 15th birthday, which he uses to shoot down some pigeons that have been nesting in his grandmother’s barn. Before he buries them, he studies the dead birds’ feathers. He is amazed by their swirls of color, “designs executed, it seemed, in a controlled rapture.” And suddenly the fears that have plagued him are lifted, and with a “slipping sensation along his nerves that seemed to give the air hands, he was robed in this certainty: that the God who had lavished such craft upon these worthless birds would not destroy His whole Creation by refusing to let David live forever.”

Fear of death is an undercurrent of belief. The spirits of dead ancestors, ghosts, immortal deities, heaven and hell, the everlasting soul: the notion of spiritual existence after death is at the heart of almost every religion. According to some adaptationists, this is part of religion’s role, to help humans deal with the grim certainty of death. Believing in God and the afterlife, they say, is how we make sense of the brevity of our time on earth, how we give meaning to this brutish and short existence. Religion can offer solace to the bereaved and comfort to the frightened.

But the spandrelists counter that saying these beliefs are consolation does not mean they offered an adaptive advantage to our ancestors. “The human mind does not produce adequate comforting delusions against all situations of stress or fear,” wrote Pascal Boyer, a leading byproduct theorist, in “Religion Explained,” which came out a year before Atran’s book. “Indeed, any organism that was prone to such delusions would not survive long.”

Whether or not it is adaptive, belief in the afterlife gains power in two ways: from the intensity with which people wish it to be true and from the confirmation it seems to get from the real world. This brings us back to folkpsychology. We try to make sense of other people partly by imagining what it is like to be them, an adaptive trait that allowed our ancestors to outwit potential enemies. But when we think about being dead, we run into a cognitive wall. How can we possibly think about not thinking? “Try to fill your consciousness with the representation of no-consciousness, and you will see the impossibility of it,” the Spanish philosopher Miguel de Unamuno wrote in “Tragic Sense of Life.” “The effort to comprehend it causes the most tormenting dizziness. We cannot conceive of ourselves as not existing.”

Much easier, then, to imagine that the thinking somehow continues. This is what young children seem to do, as a study at Florida Atlantic University demonstrated a few years ago. Jesse Bering and David Bjorklund, the psychologists who conducted the study, used finger puppets to act out the story of a mouse, hungry and lost, who is spotted by an alligator. “Well, it looks like Brown Mouse got eaten by Mr. Alligator,” the narrator says at the end. “Brown Mouse is not alive anymore.”

Afterward, Bering and Bjorklund asked their subjects, ages 4 to 12, what it meant for Brown Mouse to be “not alive anymore.” Is he still hungry? Is he still sleepy? Does he still want to go home? Most said the mouse no longer needed to eat or drink. But a large proportion, especially the younger ones, said that he still had thoughts, still loved his mother and still liked cheese. The children understood what it meant for the mouse’s body to cease to function, but many believed that something about the mouse was still alive.

“Our psychological architecture makes us think in particular ways,” says Bering, now at Queen’s University in Belfast, Northern Ireland. “In this study, it seems, the reason afterlife beliefs are so prevalent is that underlying them is our inability to simulate our nonexistence.”

It might be just as impossible to simulate the nonexistence of loved ones. A large part of any relationship takes place in our minds, Bering said, so it’s natural for it to continue much as before after the other person’s death. It is easy to forget that your sister is dead when you reach for the phone to call her, since your relationship was based so much on memory and imagined conversations even when she was alive. In addition, our agent-detection device sometimes confirms the sensation that the dead are still with us. The wind brushes our cheek, a spectral shape somehow looks familiar and our agent detection goes into overdrive. Dreams, too, have a way of confirming belief in the afterlife, with dead relatives appearing in dreams as if from beyond the grave, seeming very much alive.

Belief is our fallback position, according to Bering; it is our reflexive style of thought. “We have a basic psychological capacity that allows anyone to reason about unexpected natural events, to see deeper meaning where there is none,” he says. “It’s natural; it’s how our minds work.”

Intriguing as the spandrel logic might be, there is another way to think about the evolution of religion: that religion evolved because it offered survival advantages to our distant ancestors. This is where the action is in the science of God debate, with a coterie of adaptationists arguing on behalf of the primary benefits, in terms of survival advantages, of religious belief.

The trick in thinking about adaptation is that even if a trait offers no survival advantage today, it might have had one long ago. This is how Darwinians explain how certain physical characteristics persist even if they do not currently seem adaptive — by asking whether they might have helped our distant ancestors form social groups, feed themselves, find suitable mates or keep from getting killed. A facility for storing calories as fat, for instance, which is a detriment in today’s food-rich society, probably helped our ancestors survive cyclical famines.

So trying to explain the adaptiveness of religion means looking for how it might have helped early humans survive and reproduce. As some adaptationists see it, this could have worked on two levels, individual and group. Religion made people feel better, less tormented by thoughts about death, more focused on the future, more willing to take care of themselves. As William James put it, religion filled people with “a new zest which adds itself like a gift to life . . . an assurance of safety and a temper of peace and, in relation to others, a preponderance of loving affections.”

Such sentiments, some adaptationists say, made the faithful better at finding and storing food, for instance, and helped them attract better mates because of their reputations for morality, obedience and sober living. The advantage might have worked at the group level too, with religious groups outlasting others because they were more cohesive, more likely to contain individuals willing to make sacrifices for the group and more adept at sharing resources and preparing for warfare.

One of the most vocal adaptationists is David Sloan Wilson, an occasional thorn in the side of both Scott Atran and Richard Dawkins. Wilson, an evolutionary biologist at the State University of New York at Binghamton, focuses much of his argument at the group level. “Organisms are a product of natural selection,” he wrote in “Darwin’s Cathedral: Evolution, Religion, and the Nature of Society,” which came out in 2002, the same year as Atran’s book, and staked out the adaptationist view. “Through countless generations of variation and selection, [organisms] acquire properties that enable them to survive and reproduce in their environments. My purpose is to see if human groups in general, and religious groups in particular, qualify as organismic in this sense.”

Wilson’s father was Sloan Wilson, author of “The Man in the Gray Flannel Suit,” an emblem of mid-’50s suburban anomie that was turned into a film starring Gregory Peck. Sloan Wilson became a celebrity, with young women asking for his autograph, especially after his next novel, “A Summer Place,” became another blockbuster movie. The son grew up wanting to do something to make his famous father proud.

“I knew I couldn’t be a novelist,” said Wilson, who crackled with intensity during a telephone interview, “so I chose something as far as possible from literature — I chose science.” He is disarmingly honest about what motivated him: “I was very ambitious, and I wanted to make a mark.” He chose to study human evolution, he said, in part because he had some of his father’s literary leanings and the field required a novelist’s attention to human motivations, struggles and alliances — as well as a novelist’s flair for narrative.

Wilson eventually chose to study religion not because religion mattered to him personally — he was raised in a secular Protestant household and says he has long been an atheist — but because it was a lens through which to look at and revivify a branch of evolutionary theory that had fallen into disrepute. When Wilson was a graduate student at Michigan State University in the 1970s, Darwinians were critical of group selection, the idea that human groups can function as single organisms the way beehives or anthills do. So he decided to become the man who rescued this discredited idea. “I thought, Wow, defending group selection — now, that would be big,” he recalled. It wasn’t until the 1990s, he said, that he realized that “religion offered an opportunity to show that group selection was right after all.”

Dawkins once called Wilson’s defense of group selection “sheer, wanton, head-in-bag perversity.” Atran, too, has been dismissive of this approach, calling it “mind blind” for essentially ignoring the role of the brain’s mental machinery. The adaptationists “cannot in principle distinguish Marxism from monotheism, ideology from religious belief,” Atran wrote. “They cannot explain why people can be more steadfast in their commitment to admittedly counterfactual and counterintuitive beliefs — that Mary is both a mother and a virgin, and God is sentient but bodiless — than to the most politically, economically or scientifically persuasive account of the way things are or should be.”

Still, for all its controversial elements, the narrative Wilson devised about group selection and the evolution of religion is clear, perhaps a legacy of his novelist father. Begin, he says, with an imaginary flock of birds. Some birds serve as sentries, scanning the horizon for predators and calling out warnings. Having a sentry is good for the group but bad for the sentry, which is doubly harmed: by keeping watch, the sentry has less time to gather food, and by issuing a warning call, it is more likely to be spotted by the predator. So in the Darwinian struggle, the birds most likely to pass on their genes are the nonsentries. How, then, could the sentry gene survive for more than a generation or two?

To explain how a self-sacrificing gene can persist, Wilson looks to the level of the group. If there are 10 sentries in one group and none in the other, 3 or 4 of the sentries might be sacrificed. But the flock with sentries will probably outlast the flock that has no early-warning system, so the other 6 or 7 sentries will survive to pass on the genes. In other words, if the whole-group advantage outweighs the cost to any individual bird of being a sentry, then the sentry gene will prevail.

There are costs to any individual of being religious: the time and resources spent on rituals, the psychic energy devoted to following certain injunctions, the pain of some initiation rites. But in terms of intergroup struggle, according to Wilson, the costs can be outweighed by the benefits of being in a cohesive group that out-competes the others.

There is another element here too, unique to humans because it depends on language. A person’s behavior is observed not only by those in his immediate surroundings but also by anyone who can hear about it. There might be clear costs to taking on a role analogous to the sentry bird — a person who stands up to authority, for instance, risks losing his job, going to jail or getting beaten by the police — but in humans, these local costs might be outweighed by long-distance benefits. If a particular selfless trait enhances a person’s reputation, spread through the written and spoken word, it might give him an advantage in many of life’s challenges, like finding a mate. One way that reputation is enhanced is by being ostentatiously religious.

“The study of evolution is largely the study of trade-offs,” Wilson wrote in “Darwin’s Cathedral.” It might seem disadvantageous, in terms of foraging for sustenance and safety, for someone to favor religious over rationalistic explanations that would point to where the food and danger are. But in some circumstances, he wrote, “a symbolic belief system that departs from factual reality fares better.” For the individual, it might be more adaptive to have “highly sophisticated mental modules for acquiring factual knowledge and for building symbolic belief systems” than to have only one or the other, according to Wilson. For the group, it might be that a mixture of hardheaded realists and symbolically minded visionaries is most adaptive and that “what seems to be an adversarial relationship” between theists and atheists within a community is really a division of cognitive labor that “keeps social groups as a whole on an even keel.”

Even if Wilson is right that religion enhances group fitness, the question remains: Where does God come in? Why is a religious group any different from groups for which a fitness argument is never even offered — a group of fraternity brothers, say, or Yankees fans?

Richard Sosis, an anthropologist with positions at the University of Connecticut and Hebrew University of Jerusalem, has suggested a partial answer. Like many adaptationists, Sosis focuses on the way religion might be adaptive at the individual level. But even adaptations that help an individual survive can sometimes play themselves out through the group. Consider religious rituals.

“Religious and secular rituals can both promote cooperation,” Sosis wrote in American Scientist in 2004. But religious rituals “generate greater belief and commitment” because they depend on belief rather than on proof. The rituals are “beyond the possibility of examination,” he wrote, and a commitment to them is therefore emotional rather than logical — a commitment that is, in Sosis’s view, deeper and more long-lasting.

Rituals are a way of signaling a sincere commitment to the religion’s core beliefs, thereby earning loyalty from others in the group. “By donning several layers of clothing and standing out in the midday sun,” Sosis wrote, “ultraorthodox Jewish men are signaling to others: ‘Hey! Look, I’m a haredi’ — or extremely pious — ‘Jew. If you are also a member of this group, you can trust me because why else would I be dressed like this?’ ” These “signaling” rituals can grant the individual a sense of belonging and grant the group some freedom from constant and costly monitoring to ensure that its members are loyal and committed. The rituals are harsh enough to weed out the infidels, and both the group and the individual believers benefit.

In 2003, Sosis and Bradley Ruffle of Ben Gurion University in Israel sought an explanation for why Israel’s religious communes did better on average than secular communes in the wake of the economic crash of most of the country’s kibbutzim. They based their study on a standard economic game that measures cooperation. Individuals from religious communes played the game more cooperatively, while those from secular communes tended to be more selfish. It was the men who attended synagogue daily, not the religious women or the less observant men, who showed the biggest differences. To Sosis, this suggested that what mattered most was the frequent public display of devotion. These rituals, he wrote, led to greater cooperation in the religious communes, which helped them maintain their communal structure during economic hard times.

In 1997, Stephen Jay Gould wrote an essay in Natural History that called for a truce between religion and science. “The net of science covers the empirical universe,” he wrote. “The net of religion extends over questions of moral meaning and value.” Gould was emphatic about keeping the domains separate, urging “respectful discourse” and “mutual humility.” He called the demarcation “nonoverlapping magisteria,” from the Latin magister, meaning “teacher.”

Richard Dawkins had a history of spirited arguments with Gould, with whom he disagreed about almost everything related to the timing and focus of evolution. But he reserved some of his most venomous words for nonoverlapping magisteria. “Gould carried the art of bending over backward to positively supine lengths,” he wrote in “The God Delusion.” “Why shouldn’t we comment on God, as scientists? . . . A universe with a creative superintendent would be a very different kind of universe from one without. Why is that not a scientific matter?”

The separation, other critics said, left untapped the potential richness of letting one worldview inform the other. “Even if Gould was right that there were two domains, what religion does and what science does,” says Daniel Dennett (who, despite his neo-atheist label, is not as bluntly antireligious as Dawkins and Harris are), “that doesn’t mean science can’t study what religion does. It just means science can’t do what religion does.”

The idea that religion can be studied as a natural phenomenon might seem to require an atheistic philosophy as a starting point. Not necessarily. Even some neo-atheists aren’t entirely opposed to religion. Sam Harris practices Buddhist-inspired meditation. Daniel Dennett holds an annual Christmas sing-along, complete with hymns and carols that are not only harmonically lush but explicitly pious.

And one prominent member of the byproduct camp, Justin Barrett, is an observant Christian who believes in “an all-knowing, all-powerful, perfectly good God who brought the universe into being,” as he wrote in an e-mail message. “I believe that the purpose for people is to love God and love each other.”

At first blush, Barrett’s faith might seem confusing. How does his view of God as a byproduct of our mental architecture coexist with his Christianity? Why doesn’t the byproduct theory turn him into a skeptic?

“Christian theology teaches that people were crafted by God to be in a loving relationship with him and other people,” Barrett wrote in his e-mail message. “Why wouldn’t God, then, design us in such a way as to find belief in divinity quite natural?” Having a scientific explanation for mental phenomena does not mean we should stop believing in them, he wrote. “Suppose science produces a convincing account for why I think my wife loves me — should I then stop believing that she does?”

What can be made of atheists, then? If the evolutionary view of religion is true, they have to work hard at being atheists, to resist slipping into intrinsic habits of mind that make it easier to believe than not to believe. Atran says he faces an emotional and intellectual struggle to live without God in a nonatheist world, and he suspects that is where his little superstitions come from, his passing thought about crossing his fingers during turbulence or knocking on wood just in case. It is like an atavistic theism erupting when his guard is down. The comforts and consolations of belief are alluring even to him, he says, and probably will become more so as he gets closer to the end of his life. He fights it because he is a scientist and holds the values of rationalism higher than the values of spiritualism.

This internal push and pull between the spiritual and the rational reflects what used to be called the “God of the gaps” view of religion. The presumption was that as science was able to answer more questions about the natural world, God would be invoked to answer fewer, and religion would eventually recede. Research about the evolution of religion suggests otherwise. No matter how much science can explain, it seems, the real gap that God fills is an emptiness that our big-brained mental architecture interprets as a yearning for the supernatural. The drive to satisfy that yearning, according to both adaptationists and byproduct theorists, might be an inevitable and eternal part of what Atran calls the tragedy of human cognition.

Robin Marantz Henig, a contributing writer, has written recently for the magazine about the neurobiology of lying and about obesity.