Friday, May 29, 2009

An Alternative to Materialism's Funeral?: The Frankfurt School Option


After the critical setbacks of World War One, the international Marxist community experienced a crisis - an existential crisis that forced its members to confront the validity of their entire worldview, and which sounds ominously like the crisis materialism is experiencing today…

To quote Pat Buchanan in his book, The Death of the West:

“Nothing the Marxists had predicted had come to pass. Their hour had come and gone. The workers of the West, the mythical proletariat, had refused to play the role history had assigned them. How could Marx have been so wrong?”

To address that question, “In 1923, [Georg] Lukacs and members of the German Communist party set up, at Frankfurt University, an Institute for Marxism…It would soon come to be known simply as the Frankfurt School….The Frankfurt School began to retranslate Marxism into cultural terms….And their ideas have triumphed.”

What was their main idea?

You may have heard of it.

“False consciousness.”

Before explaining “false consciousness” (and what it means for materialism), here's some background:

In the 19th century, Karl Marx “offered an objective theory of class, based on an analysis of the objective features of the system of economic relations that constitute the social order. A person's social class is determined by his or her position within the system of property relations that constitutes a given economic society…. People also have subjective characteristics: thoughts, mental frameworks, and identities. These mental constructs give the person a cognitive framework in terms of which the person understands his or her role in the world and the forces that govern his or her life.

Social mechanisms emerge in class society that systematically create distortions, errors, and blind spots in the consciousness of the underclass. If these consciousness-shaping mechanisms did not exist, then the underclass, always a majority, would quickly overthrow the system of their domination. So the institutions that shape the person’s thoughts, ideas, and frameworks develop in such a way as to generate false consciousness and ideology.”

While it’s true that Marx invented the overall concept of “false consciousness,” he didn’t coin that exact term (his collaborator, Friedrich Engels, deserves the credit for that), and neither Marx nor Engels gave the idea much importance.

In the 20th century, however, the Frankfurt School elevated “false consciousness” to the penthouse of Marxist doctrine.

Even today – 20 years after the fall of the Berlin Wall – the idea of “false consciousness” permeates our culture. Political leftists continue to insist that people who disagree with them but SHOULD AGREE with them (for example, white working class families) have been blinded to their true class interests by “superstructures” such as Christianity, capitalism, etc.

In other words, YOU (the “common people”) are NOT rational, because if you were, you would agree with us; and since you don’t agree with us, we must explain to you WHY you are irrational. Once we do that, we will neutralize your ability to wage an argument, and finally, defeat you by default.

For a fuller explanation of this theory and strategy, see Thomas Frank’s best-selling book What’s the Matter with Kansas?: How Conservatives Won the Heart of America.

To quote Pat Buchanan again: “America’s elites, who may not even know today who the Frankfurt thinkers were, have taken to their ideas like catnip.”

Still with me?

Sorry for all the retro Marxist mumbo-jumbo.

Now to the main event…

As I’ve mentioned before (here, here, and here), the reductionists aren’t just aggressively protecting their turf when it comes to Darwinism (using the courts to squelch Intelligent Design, for example), they are also going on offense when it comes to neurobiology – using the media to champion the idea that free will is an “illusion” and that our behavior is determined by a proper mix of genes and environmental factors.

There are a lot of reasons for this new offensive, but it’s likely that on some level – at least beneath the surface – one of those reasons is to deflect attention away from the compelling logic of Intelligent Design by…wait for it…denying the power of logic itself!

In the same way that the Frankfurt School reacted to the failure of Marxism (by blaming the “distortions, errors, and blind spots in the consciousness of the underclass”), the Reductionists are reacting to their failure to earn the conviction of the American people (see these polls) by reveling in the idea that people themselves CAN'T form rational convictions – that “people often have little or no information about the real causes of their own behavior” and that “it has been hard to find any correlation between moral reasoning and proactive moral behavior.”

In other words, if you don’t agree with the Reductionists, it’s because your rational capacities are really a disguise created by your emotions, and you suffer from “false consciousness.”

On an intellectual level, the Frankfurt School kept the spirit of Marxism alive for decades after it became clear that the economic underpinning of Marxism (its original inspiration) had been proven false by its undisputed failure in the Communist Bloc.

Subtly, this new “Frankfurt School” of Reductionism is hoping to pick up where the old cultural Marxists left off – by keeping the game going a little while longer – not by waging an objective battle for truth, but by denying Man’s ability to know Truth altogether.

-Todd

**UPDATE, JUNE 2, 2009**

In case you thought I was exaggerating, take a look at this article, Childhood Origins of Adult Resistance to Science. This piece gives new meaning to the term "reductionism."

From the abstract:

"Resistance to certain scientific ideas derives in large part from assumptions and biases that can be demonstrated experimentally in young children and that may persist into adulthood.

In particular, both adults and children resist acquiring scientific information that clashes with common-sense intuitions about the physical and psychological domains.


Additionally, when learning information from other people, both adults and children are sensitive to the trustworthiness of the source of that information.

Resistance to science, then, is particularly exaggerated in societies where nonscientific ideologies have the advantages of being both grounded in common sense and transmitted by trustworthy sources."

Ah...so from the Darwinist perspective, people who disagree with the "reductionist paradigm" are stuck in childish ways of thinking...how mature!

-Todd

**UPDATE, JUNE 3, 2009**

As the old saying goes, “those who live by the sword die by the sword.” According to this article, scientists have to overcome their own “childish” mindsets.

Religion among Academic Scientists: Distinctions, Disciplines, and Demographics

Abstract: The religiosity of scientists is a persistent topic of interest and debate among both popular and academic commentators. Researchers look to this population as a case study for understanding the intellectual tensions between religion and science and the possible secularizing effects of education…

Using data from a recent survey of academic scientists at twenty-one elite U.S. research universities, we compare the religious beliefs and practices of natural and social scientists within seven disciplines as well as academic scientists to the general population.

We find that field-specific and interdisciplinary differences are not as significant in predicting religiosity as other research suggests. Instead, demographic factors such as age, marital status, and presence of children in the household are the strongest predictors of religious difference among scientists. In particular, religiosity in the home as a child is the most important predictor of present religiosity among this group of scientists.

So...if both scientists AND believers are prisoners of their “childish” mindsets, who then is qualified to discuss these critical questions of existence? Apparently, no one.

Luckily, we have an alternative: We can stop psychoanalyzing our opponents, recommit to logic, and follow the evidence wherever it leads. Of course, the Darwinbots might oppose that idea too - because they know the evidence does not lead in their direction.

-Todd


"The Great Relearning"




In my humble opinion, best-selling author Tom Wolfe is one of the most insightful critics of contemporary American life. Even when I don’t agree with him (see this article), Wolfe always has plenty of interesting things to say that stimulate new ideas, visions, and theories. His latest novel, I Am Charlotte Simmons, is probably the second-best novel I’ve ever read (after Atlas Shrugged).


But I digress…


In 1987, Mr. Wolfe wrote an article for the American Spectator called Brave New World Bites the Dust (unfortunately, it isn’t available online). The thrust of Mr. Wolfe’s piece is that American culture – after surviving the transformational shocks of the 1960s and 70s – is experiencing a social conservative awakening; what he calls a “Great Relearning.”


The hippies, as they became known, sought nothing less than to sweep aside all codes and restraints of the past and start out from zero. At one point Ken Kesey organized a pilgrimage to Stonehenge with the idea of returning to our civilization's point zero, which he figured was Stonehenge, and heading out all over again to do it better.


In politics, the twentieth century's great start from zero was one-party socialism, also known as communism or Marxism-Leninism.


Today the relearning has reached the point that even ruling circles in the Soviet Union and China have begun to wonder how best to convert communism into something other than, in Susan Sontag's phrase, successful fascism. The great U.S. contribution to the twentieth century's start from zero was in the area of manners and mores, especially in what was rather primly called 'the sexual revolution.' In every U.S. hamlet, even in the erstwhile Bible Belt, may be found the village brothel, no longer hidden in a house of blue lights or red lights behind a green door but openly advertised by the side of the road with a 1,000-watt back-lit plastic sign: Totally All-Nude Girl Sauna Massage And Marathon Encounter Sessions Inside.


But in the sexual revolution, too, the painful dawn has already arrived, and the relearning is imminent. All may be summed up in a single term, requiring no amplification: AIDS.

The concept of a “Great Relearning” was taken up and expanded by several commentators, including Daniel Yankelovich (see his speech, “Lurch and Learn,” delivered in 1997), Francis Fukuyama, in his excellent book, The Great Disruption (published in 1999), and David Frum, in his equally fine How We Got Here: The 70’s--The Decade that Brought You Modern Life--For Better or Worse (published in 2000).


From Yankelovich’s speech…


Towards the end of the 1960s, our market research studies were picking up more and more reverberations of the nation's changing values…Across the board, in life insurance, cars, foods, cosmetics, housing, women's clothing and other products and services, the new values began to express themselves both in consumer behavior and on the societal front.


These value changes were strongest among young people, creating what was famously called a "generation gap."


By the end of the decade the enormity of the change was unmistakable.


Our research uncovered several extraordinary discontinuities in values which we later came to think of as "lurches." There were two such lurches that later turned out to be closely related. One was a lurch from a depression psychology to a psychology of affluence. The depression psychology had taken hold in the 1930s and, remarkably, had persisted throughout the 1950s and early 1960s, even when the economy was steadily improving. Indeed, it wasn't until the middle of the 1960s that people began to think, "Well, maybe these good times will continue. Maybe they will really last." Almost overnight, people shifted from the conviction that "it won't last and we had better save every penny" to the conviction that "it's here forever; there's no limit to what we can now do with all this new affluence."


The other lurch was, as stated earlier, a shift from automatic sacrifice for the family to a questioning of the need for sacrifice in an affluent society. Unfortunately, questioning the desirability of unnecessary sacrifice in the 1960s and 1970s evolved in the 1980s into questioning the desirability of any sacrifice whatever. We have been struggling with the consequences of this shift ever since.


Yankelovich then goes on to say…

We have come to the conclusion that the theory that best accounts for the discontinuities, the seeming contradictions, and the odd patterns of movement in the tracking data we have been collecting over the past 35 years is a theory we call "lurch-and-learn." It is a pattern that starts with a sharp discontinuity, often a reversal (a lurch), which is then followed by a complex series of modifications based on social learning, some of which are valid and some of which are false learning…


Society's lurch and learn process is far more mistake-prone than individual learning. Society's lurches can lead to serious mistakes before corrective learning takes hold. We have developed some useful insights into the kind of learning that occurs in the lurch phase. When people find themselves in the full heat of a reactive lurch, their mood is unstable. Learning occurs exclusively in the direction of the lurch. In the lurch phase, people are quite error prone because of their strong emotions.


People don't like to change. They don't like to admit to mistakes. They don't like to be reasonable when they are frustrated. So it takes a very long time for post-lurch learning to take place. In brief, what we are living through right now in the late 1990s are the results of both valid and false learning from lurches in value changes that took place in the 1960s, 1970s and 1980s. It has taken us a long time to deal with mistakes made in that period, and we still have a long way to go.


This is fascinating stuff. I share the opinion that social mores have gone through a two-phase process of “lurch and learn,” and here in 2009, we are probably in the final stages of a decades-long “Great Relearning,” which encompasses the initial “Lurch” part (the 60s and 70s) and the “Learn” part (the 80s, 90s, and 2000s).


So what does all this have to do with The Mustard Seed?


Well, first of all, I think these concepts put a lot of meat on the bones of my May 12 post: "Will the Economic Crisis Inflame the Culture Wars or End Them?"


While none of the authors say this directly (after all, they were talking about events prior to 2000), we might infer that today – in the year 2009 – the current “Lurch and Learn/Great Relearning” may be nearing completion, and that we’re on the cusp of a new “Lurch/Learn” cycle (the content of which we can hardly know).


But even more grandly…can we entertain the idea that perhaps Science and Religion are experiencing their own “Lurch and Learn?”


If we are so bold…let’s mark the beginning of the “Lurch” phase of aggressive, militant Christianity in 325 A.D. with the Council of Nicaea…then let’s begin the “Learn” phase with the Renaissance of the 1500s.


Then began the Scientific Revolution.


The “Lurch” phase can be dated to Darwin’s unveiling of his theory of “natural selection” in 1859…and the “Learn” phase could be dated to Brandon Carter’s articulation of the “Anthropic Principle” in 1973.


To help facilitate this “Great Relearning” in the scientific community, Christians can promote “scientific concepts” which support religion such as Intelligent Design.


As I wrote on Mar. 23: “History has an odd habit of repeating itself. But never in the way you expect.”


-Todd


Weekly Wrap-Up


Dr. Francis Collins, author of the best-selling, but ultimately self-defeating book, The Language of God, is expected to be President Obama’s nominee to head the National Institutes of Health. A cliché choice for a cliché President.


A review of Niles Eldredge's book, The Triumph of Evolution and the Failure of Creationism: "Where is the experimental evidence? None exists in the literature claiming that one species has been shown to evolve into another. Bacteria, the simplest form of independent life, are ideal for this kind of study, with generation times of 20 to 30 minutes, and populations achieved after 18 hours. But throughout 150 years of the science of bacteriology, there is no evidence that one species of bacteria has changed into another, in spite of the fact that populations have been exposed to potent chemical and physical mutagens and that, uniquely, bacteria possess extrachromosomal, transmissible plasmids. Since there is no evidence for species changes between the simplest forms of unicellular life, it is not surprising that there is no evidence for evolution from prokaryotic to eukaryotic cells, let alone throughout the whole array of higher multicellular organisms."


Denyse O'Leary Coins A Great Term: "Darwinbots." "I call the outraged Darwinists “Darwinbots” because I’ll bet that most of them have never seen The Privileged Planet or considered grappling with the questions it raises about design and purpose in the universe. Einstein and Heisenberg grappled with these questions, but the matters that torment great thinkers are beneath the notice of Darwinbots, whose program does not, so far as I can see, contain an independent thinking module."


Denyse O'Leary on the Motivation Behind "Evolutionary Psychology:" The Darwinist "cannot accept that humans actually have consciousness or free will, or that our current circumstances largely result from the exercise of these functions. Rather, he needs to find the answers in the unthinking behavior of non-humans and pre-humans...Any explanation of that sort, no matter how ridiculous, will always make more sense to him than any explanation based on the effects of intelligence, as a creative force in its own right."


Denyse O'Leary on the Link between Darwinism and Marxism: "I don’t see my blog as an exercise in Christian apologetics...I am fascinated by the decline of materialism, typified by widespread public scoffing at its creation story, Darwinism. Had I been born twenty years earlier, I might have covered the decline of Marxism in the same intrigued way, but that show was largely over by the mid-Eighties." For my take on this subject, see here and here.


Thursday, May 28, 2009

The Fundamental Flaw with I.D. (and How to Fix It)


Buried in the text of this American Chronicle article about the I.D. controversy are a few intriguing quotes which - in my view - expose one of the major flaws in the I.D. movement.

The author of the article, Kazmer Ujvarosy, interviews Mike Keas, who teaches in the Master of Arts program in Science and Religion at Biola University and is a Senior Fellow at the Discovery Institute's Center for Science and Culture.

According to Mike...

"ID theory today does not invoke a supernatural form of intelligent agency. Mere intelligent causation is the defining inference of 21st-century ID theory. The exact character of that intelligence, ID advocates insist, is not a scientific question.”

Then later, Mike says...

"Whether the designing intelligence is human, alien, the mind of a living universe, or an intelligent being beyond the cosmos is not an issue that can be effectively addressed through scientific inquiry. Other academic disciplines are better suited to plumb the depths of such fascinating questions.”

This leaves Kazmer to ponder...

"So we are told by ID theorists in no uncertain terms not to depend on them for illumination regarding the inferred intelligence’s exact identity. ID is strictly about design detection. It’s outside the scope of ID to speculate about the intelligence responsible for design in nature, or about the methods of the designer. This leaves us in the dark, and we have no choice but to cast light on the inferred intelligence’s identity without the help of ID theorists."

I think this is an important dialogue, because it reveals the self-imposed limits of the I.D. movement - limits that needlessly cripple its ability to grow from an intriguing news item into a major challenge to the current scientific/social paradigm.

In Chapter 9 of The Mustard Seed, Heather Manning elaborates...

"This paradigm shift - as remarkable as it is - will be, for most people, nothing more than a curiosity unless it is accompanied by a personal life philosophy - a new approach to living in light of this revolutionary information. That's why I've constructed this idea of 'spiritual rationalism,' which includes the concepts of self-interest, integrity, and love - everything I've explained here today."


Bottom line: The scope of I.D. needs to expand beyond the question of "Is there a Designer?" and examine the following questions: "What is the nature of that Designer?" and then, "What does that mean for the individual and society?"

As it stands today...if a person reads a few stories about I.D. and only retains the following message: "Don't bury God yet; there's still a good chance He might exist," then the potential growth of I.D. itself is limited for two reasons.

First, at the risk of pointing out the obvious, the vast majority of people in this country are Christians.

Whether or not they are serious enough about their faith to earn that title is another story, but in their own minds, they are sincere Christians. And therefore, while the I.D. story is intriguing, it is ultimately irrelevant. They know the Bible is true. They knew that before I.D. And, should I.D. ever be disproven, they will know that AFTER I.D.

Second, that message will not penetrate the small (but rapidly growing) segment of the population (especially among young people) who are agnostic but open to religion - as long as it's not Christian fundamentalism.

Why? Because they've already had their fill of Christian fundamentalism, and they don't care for it much. If the message of I.D. is "there might be a God," and their only conception of God is the one propagated by Christian fundamentalists, then young people will reluctantly return to the dreary swamp of agnosticism.

That's why I don't want I.D. to be a "high-tech, modern" path to Christian fundamentalism. I want the tools that make I.D. "work" (a respect for reality and reason - regardless of the ideological consequences) to be adopted in all facets of our life - from personal ethics to faith.

I want a rational form of ethics; I want a rational faith. Something that nourishes the mind AND soul.

Intrigued? Go here for more.

-Todd

Wednesday, May 27, 2009

Quote of the Day

“The brain is necessary for consciousness. Of course! Just as an engine is necessary in a car. But an engine doesn't ‘give rise’ to driving; driving isn't something that happens inside the engine. The engine contributes to the car's ability to drive."

"Trying to understand consciousness in neural terms alone is like trying to understand a car driving down the road only in terms of its engine. It's bad philosophy masquerading as science.”
Philosopher Alva Noë

The Medium is the Message


“The medium is the message" is one of the most famous phrases in communications theory. Coined by Canadian philosopher Marshall McLuhan, that phrase captures the truth that a particular medium of communications (such as a TV program or a movie) affects society not only through its “content,” but also by the “characteristics of the medium itself.”

Per Wikipedia:

McLuhan claimed in Understanding Media that all media have characteristics that engage the viewer in different ways; for instance, a passage in a book could be reread at will, but a movie had to be screened again in its entirety to study any individual part of it.

So the medium through which a person encounters a particular piece of content would have an effect on the individual's understanding of it.

Some media, like the movies, enhance one single sense, in this case vision, in such a manner that a person does not need to exert much effort in filling in the details of a movie image.

McLuhan contrasted this with TV, which he claimed requires more effort on the part of viewer to determine meaning, and comics, which due to their minimal presentation of visual detail require a high degree of effort to fill in details that the cartoonist may have intended to portray.

A movie is thus said by McLuhan to be 'hot' (intensifying one single sense) and "high definition" (demanding a viewer's attention), and a comic book to be 'cool' and 'low definition' (requiring much more conscious participation by the reader to extract value).

This concentration on the medium and how it conveys information — rather than on the specific content of the information — is the focal point of 'the medium is the message.'

Since McLuhan passed away in 1980, we can only speculate how the great Canadian would’ve interpreted the latest and greatest media technology: the Internet. By bringing such an unprecedented amount of information to the fingertips of computer users, it’s easy to understand why many experts have called the Internet “the greatest invention for the dissemination of ideas and information since the Gutenberg press.”

The Gutenberg press, just to refresh your memory, was Johannes Gutenberg’s breakthrough invention of the mechanical printing press in the mid-1400s. It revolutionized book-making, increased literacy, and indeed, according to many historians, helped facilitate the Protestant Reformation.

From The Complete Idiot's Guide to the Reformation & Protestantism:

“An often overlooked but vital occurrence that helped make the Reformation possible was the invention of the printing press. This made the Bible and other printed materials from the reformers widely available to the European population.”

“The teachings and writings of the first figures of the Protestant Reformation (Luther and Melanchthon) were widely circulated…In short order, Luther’s 95 Theses, as well as his other works, were distributed throughout Germany. For the first time in church history, all classes of society had access to printed reformationist materials.

“It was a whole new world in Europe, a world in which the dominant religious establishment would be under an intense public scrutiny like it had never known.

“Historians look at the events that led to the Protestant Reformation as a fascinating mixture of thinking, reading, and action – all converging to bring about a revolution that would change the whole Western world.”

To summarize: “The Medium is the Message.” And the greatest revolutionary “medium” (the printing press) led to one of the greatest religious upheavals in history (the Protestant Reformation). That’s something to consider when evaluating the Internet’s impact on society in the future.

The Internet isn’t just increasing the speed of information and changing the way we gather that information (via a computer screen), it is potentially changing the way we THINK about that information – exposing us to new ideas, and exponentially increasing our ability to communicate with other people ABOUT those ideas.

With that in mind, wouldn’t it be fair to claim that new religious and political ideas seem appropriate for the “medium” of the Internet Age – especially when we consider that, were it not for the Internet, the I.D. movement would be sharply restricted in its ability to reach the public (since the Scientific Establishment has deliberately excluded I.D. from "mainstream" sources of information)?

Plus, when you consider that the Internet Age is still in its infancy, it's reasonable to assume that the synergy between the Internet and Intelligent Design has major room for growth.

If we date the emergence of the Internet to 1995 (with the release of Netscape Navigator), today the Internet is just 14 years old. It is still a teenager struggling to master its newfound skills and abilities.

A decade from now, when the Internet becomes a young adult, who knows what it might be capable of.

-Todd


**UPDATE, AUG. 19, 2009**

Still think social media is a fad? This 4-minute YouTube clip explains the Revolution.

Friday, May 22, 2009

Weekly Wrap-Up



An Excellent Speech by Mark Steyn on the Link Between Freedom, Faith, and Government:
"To rekindle the spark of liberty once it dies is very difficult. The inertia, the ennui, the fatalism is more pathetic than the demographic decline and fiscal profligacy of the social democratic state, because it's subtler and less tangible. But once in a while it swims into very sharp focus. Here is the writer Oscar van den Boogaard from an interview with the Belgian paper De Standaard. Mr. van den Boogaard, a Dutch gay 'humanist' (which is pretty much the trifecta of Eurocool), was reflecting on the accelerating Islamification of the Continent and concluding that the jig was up for the Europe he loved. 'I am not a warrior, but who is?' he shrugged. 'I have never learned to fight for my freedom. I was only good at enjoying it.'"

George Gilder on What Information Theory Tells Us About Biology: "Information is defined by its independence from physical determination... Like a sheet of paper or a series of magnetic points on a computer’s hard disk or the electrical domains in a random-access memory — or indeed all the undulations of the electromagnetic spectrum that bear information through air or wires in telecommunications — DNA is a neutral carrier of information, independent of its chemistry and physics....Wherever there is information, there is a preceding intelligence."

American Spectator Cover Story on Intelligent Design: "It is precisely because intelligent design relies upon scientific methods and evidence that it is regarded by the materialists as so extraordinarily dangerous. It threatens to allow religion to escape from the ghetto assigned to it by the dominant 19th- and 20th-century materialism...It might change conceptions about whether there is an objective moral order. It might help open minds that would otherwise be closed." (The article also contains a good explanation about the mathematical odds AGAINST natural selection being the source of complex life).

A profile of Philip Johnson, the "Father of I.D.": "I was struck by the breadth of Darwin's claims as opposed to how scanty were the observable changes...I said to my wife that I shouldn't take this up. I will be ridiculed and it will consume my life. Of course, it was irresistible."

Dr. Deepak Chopra on Evolution: The world-famous doctor provides a solid list of 12 major gaps in evolutionary theory. Then, when Martin Richard prints a rebuttal, Dr. Chopra takes the time to respond to his objections here.


Wednesday, May 20, 2009

One for the Nones



I almost missed this excellent piece by Washington Post columnist Michael Gerson, which pertains to a lot of the themes found in The Mustard Seed, and also on this blog (see here, here, here, and here).


I’ve posted the article below…


A Faith for the Nones
The Right Kind of Religion Would Bring the Young Back

There is a book that everyone will be talking about -- when it appears over a year from now. American Grace: How Religion Is Reshaping Our Civic and Political Lives, being written by Robert Putnam and David Campbell, is already creating a buzz. Putnam, the author of "Bowling Alone: The Collapse and Revival of American Community," is the preeminent academic expert on American civic life. Campbell is his rising heir. And the book they haven't yet finished will make just about everyone constructively uncomfortable.

At a recent conference of journalists organized by the Pew Forum on Religion and Public Life, Putnam outlined the conclusions of "American Grace," based on research still being sifted and refined. Against the expectations of hard-core secularists, Putnam asserts, "religious Americans are nicer, happier and better citizens." They are more generous with their time and money, not only in giving to religious causes but to secular ones. They join more voluntary associations, attend more public meetings, even let people cut in line in front of them more readily. Religious Americans are three to four times more socially engaged than the unaffiliated. Ned Flanders is a better neighbor.

Against the expectations of many religious believers, this dynamic has little to do with the content of belief. Theology is not the predictor of civic behavior; being part of a community is. People become social joiners and contributors when they have friends who pierce their isolation and invite their participation. And religious friends, says Putnam, are "more powerful, supercharged friends."

Yet this kind of religious affiliation has declined among many since World War II, especially among the young. The change was not gradual or linear. It arrived, according to Putnam, in "one shock and two aftershocks."

The shock came in the 1960s. As conservatives have asserted, the philosophy of sex, drugs and rock 'n' roll is an alternative to religious affiliation (though some of the rocking religious would dispute the musical part). Baby boomers were far less religious than their parents were at the same age -- the probable result, says Putnam, of a "very rapid change in morals and customs."

This retreating tide of commitment affected nearly every denomination equally, except that it was less severe among evangelicals. While not dramatically increasing their percentage of the American population, evangelicals did increase their percentage among the religious in America. According to Putnam, religious "entrepreneurs" such as Jerry Falwell organized and channeled the conservative religious reaction against the 1960s into the religious right -- the first aftershock.

But this reaction provoked a reaction -- the second aftershock. The politicization of religion by the religious right, argues Putnam, caused many young people in the 1990s to turn against religion itself, adopting the attitude: "If this is religion, I'm not interested." The social views of this younger cohort are not entirely predictable: Both the pro-life and the homosexual-rights movement have made gains. But Americans in their 20s are much more secular than the baby boomers were at the same stage of life. About 30 to 35 percent are religiously unaffiliated (designated "nones," as opposed to "nuns" -- I was initially confused). Putnam calls this "a stunning development." As many liberals suspected, the religious right was not good for religion.

The result of the shock and aftershocks is polarization. The general level of religiosity in America hasn't changed much over the years. But, as Putnam says, "more people are very religious and many are not at all." And these beliefs have become "correlated with partisan politics." "There are fewer liberals in the pews and fewer unchurched conservatives."

The political implications are broad. Democrats must galvanize the "nones" while not massively alienating religious voters -- which is precisely what candidate Barack Obama accomplished. Republicans must maintain their base in the pew while appealing to the young -- a task they have not begun to figure out.

But Putnam regards the growth of the "nones" as a spike, not a permanent trend. The young, in general, are not committed secularists. "They are not in church, but they might be if a church weren't like the religious right. . . . There are almost certain to be religious entrepreneurs to fill that niche with a moderate evangelical religion, without political overtones."

In the diverse, fluid market of American religion there may be a demand, in other words, for grace, hope and reconciliation -- for a message of compassion and healing that appeals to people of every political background. It would be revolutionary -- but it would not be new.



Does This Look Like One of Your Relatives?



I’m not going to waste a lot of time critiquing the latest claim that scientists have discovered the “missing link” which “proves” that human beings descended from monkeys. But here’s a broad outline of the facts:

What we see above is a 47-million-year-old fossilized skeleton of a “lemur monkey” affectionately named Ida. Why is Ida so special?

She has “human-like nails instead of claws” and “opposable big toes.” Plus the shape of her talus bone (which is in the foot) apparently resembles that of humans.

That’s all very nice and good. But to be honest, I’m struggling to emotionally connect with this 1-foot, 9-inch specimen, whose head and tail remind me of a T-Rex (and look like no human I’ve ever met).

While some claim that the impact of Ida is “like an asteroid falling down to Earth,” I’m not quite willing to be so dramatic. After all, here’s another way to interpret the evidence:

1) 47 million years ago, there was a lemur monkey who had nails, opposable big toes, and a state-of-the art talus bone.

2) The end.

See what I’m getting at? Is it not possible that what we’re witnessing here is just another monkey? That it has nothing to do with humans? Or am I missing something?

Apparently, I'm not the only one who's skeptical, because NBC’s chief science correspondent Robert Bazell is also ignoring the hype. In this article, he quotes Dr. Tim White, a paleontologist at the University of California, Berkeley.

Dr. White said, "Three words: Over the top."

Robert Bazell continues:

“The people who promoted this event make a big deal out of the possible place this newly discovered fossil plays in the evolution leading to humans. But if you read their actual scientific paper in a respectable peer-reviewed scientific journal (http://www.plosone.org/article/info:doi/10.1371/journal.pone.0005723) the scientists make no such claim.

“The big question about this finding, White said, ‘is whether it is the 'Mother of All Monkeys?’ and that is not even resolved. With years of study the scientists will learn whether this is the creature that stands at the intersection of one group of primates that went on to be best represented by lemurs today or another group that went on to be chimps and humans. But they don't know yet."

I guess the message is: “Let’s not get carried away here.”

Uh oh….Too late...



In my Apr. 23 blog post, Materialism is Dead: Now What?, I wrote:

I've broken down neo-Darwinism into four main ideas: (1) that life can be produced "by chance" in a soup of chemicals, (2) that life can come from non-living matter, (3) that genetic mutation is the key to the creation of new species, and (4) that there is a logical evolutionary continuum between apes and humans.

Literally 150 years after Darwin published On the Origin of Species, there is still ZERO evidence for the first 3 tenets, and surprisingly little evidence for the final one.

Even with this week’s events, that analysis remains intact.

**UPDATE, MAY 22, 2009**


I like this comment at Uncommon Descent:

"It’s just hard to believe that all of humanity (including Mozart, Newton, Da Vinci, etc) descended from what looks like a piece of roadkill someone peeled off an interstate."

A few more articles expressing some much-needed skepticism towards the "missing link"...

Science writer Brian Switek...

"The bottom line is that the hypothesis that Darwinius is closer to anthropoids than tarsiers or omomyids does not have strong support… The grand claims about it being our ancestor, though, can not be upheld as true… They have gone hand-in-hand with the History Channel to create an aura of sensationalism around the fossil…I can only hope that Darwinius will eventually receive the careful analysis it deserves."

Chris Beard, who is the curator of vertebrate paleontology at the Carnegie Museum of Natural History:

"Ida is not a 'missing link' – at least not between anthropoids and more primitive primates. Further study may reveal her to be a missing link between other species of Eocene adapiforms, but this hardly solidifies her status as the 'eighth wonder of the world.'"

London Times science correspondent Mark Henderson:

"There is a feeling out there that publication has been rushed, and that the data don't fully support the sweeping claims that are being made.

"It is far from certain that the adapids, the group to which Ida belongs, are the ancestors of modern monkeys, apes and humans. The consensus view is that the adapids were an evolutionary dead end, and that anthropoids (monkeys etc) are the descendents of animals that looked more like modern tarsiers. This would be an issue even if this discovery had been announced in the normal way. But it's especially serious given the publicity blitz behind Ida.

"PLoS ONE, like most journals, normally release their papers to journalists under embargo, to give them good time to prepare a story and consult independent experts.

"For media outlets that had bought the rights, such as the BBC, it was a different story -- full access, weeks or even months in advance. Is it really right that full embargoed access to important and controversial research findings should be restricted on the say-so of the authors, to media that best suit their publicity strategy? Especially when money has changed hands?"

And finally, a parody by science writer Ed Yong...:

“Yesterday, the entire world changed noticeably as the media, accompanied by some scientists, unveiled a stunning fossilised primate. The creature has been named Darwinius masillae, but also goes by Ida, the Link, the Chosen One and She Who Will Save Us All.”


**UPDATE, MAY 27, 2009**

If you ain’t bored yet…here's even more Ida news...

From Times Online...

Christophe Soligo, a specialist in early primate evolution at University College London, is an admirer of Ida – but concerned over what he fears may be hype.

“This is an absolutely amazing fossil,” he said. “But to suggest she might be the missing link in human evolution is simply too much. There is a great risk of discovery bias, where we read too much into a good fossil just because we have it available.”

Robert Foley, professor of human evolution at Cambridge University, believes many people misunderstand the huge timescales involved in assessing fossils.

“This animal lived around 47 million years ago but human-like creatures only appeared in the last 2 million years,” he said. “That’s a gap of around 45 million years with many other species lying between us and that era. Any one of them could be called a missing link. Really, the term is meaningless.”

The science and the hype have had one unexpected benefit, however - they have unified in outrage two famous rival paleontologists: Elwyn Simons of Duke University, who maintains that primates emerged out of Africa, and Christopher Beard, curator of the Carnegie Museums of Pittsburgh, who counter-argues for an Asian Eden.

“Dr Simons phoned me for the first time in 10 years to share his outrage about this malarkey and, for the first time in a decade, I agree with him,” said Beard last week.

“First, the paper is shoddy scholarship because it avoids comparing Darwinius masillae with similar fossils to put it into a proper context. The roll-out was extraordinary and it is now clear that the scientists were under pressure to meet the showbusiness deadlines. The tail was wagging the dog – or maybe the lemur.”

Simons does not buy the spin either. “It’s absurd and dangerous,” he said. “This is all bad science and it plays into the hands of the creationists, who look for any excuse to discredit evolution.

“Darwinius is a wonderful fossil, but it is not a missing link of any kind. It represents a dead end in evolution. It tells us nothing that we do not already know, except that people will be overwhelmed by hype.”

**UPDATE, JUL. 21, 2009**



Scientific American pours on the skepticism…

“On May 19 the world met a most unlikely celebrity: the fossilized carcass of a housecat-size primate that lived 47 million years ago in a rain forest in what is now Germany… Ida’s significance was described in no uncertain terms as the missing link between us humans and our primate kin… But a number of outside experts have criticized these claims. Not only is Ida too old to reveal anything about the evolution of humans in particular (the earliest putative human ancestors are a mere seven million years old), but she may not even be particularly closely related to the so-called anthropoid branch of the primate family tree that includes monkeys, apes and us.”

“Critics concur that Ida is an adapiform, but they dispute the alleged ties to anthropoids. Robert Martin of the Field Museum in Chicago charges that some of the traits used to align Ida with the anthropoids do not in fact support such a relationship. Fusion of the lower jaw, for instance, is not present in the earliest unequivocal anthropoids, suggesting that it was not an ancestral feature of this group. Moreover, the trait has arisen independently in several lineages of mammals—including some lemurs—through convergent evolution. Martin further notes that Ida also lacks a defining feature of the anthropoids: a bony wall at the back of the eye socket. ‘I am utterly convinced that Darwinius has nothing whatsoever to do with the origin of higher primates,’ he declares.”

“Adapiforms ‘are related to the strepsirrhine group of living primates that include lemurs from Madagascar and galagos [bush babies] and lorises from Africa and Asia,’ insists paleontologist Richard F. Kay of Duke University. Claims by the authors to the contrary notwithstanding, he adds, ‘they are decidedly not in the direct line leading to living monkeys, apes and humans.’ Kay and others believe that a primitive primate from China called Eosimias is a better candidate ancestor of anthropoids than is Darwinius.”

**UPDATE, OCT. 23, 2009**

New article on Ida Fossil Controversy: Oh Ida, Where Have Thee Gone?