Saturday, May 23, 2009

3-Year College Degrees: Maturity Required

In today's Washington Post: Colleges Weighing 3-Year Degrees to Save Undergrads Time, Money. The article talks about how several American colleges are considering degree programs that take just three years to complete, not the customary four. The cost of a college education might thus be slashed by 25 percent.

But Justin Guiffre, a 19-year-old sophomore at George Washington University in Washington, D.C., cautions, "A three-year program could be appropriate for students who demonstrate commitment, academic excellence and maturity."

Maturity, self-control, and the ability to put off the immediate gratification of urges are excellent predictors of academic success (even more so than I.Q. — see Don't Eat the Marshmallow!, particularly the part about the research of Angela Lee Duckworth). They are rough synonyms of commitment, since that word implies being able to keep one's eyes on the ball for extended periods, until a difficult goal is achieved. So "students who demonstrate commitment, academic excellence and maturity" is a phrase twice redundant ... which is fine, because redundancy is emphasis, and emphasis needs to be put on the need for maturity among young people today.

Wednesday, May 20, 2009

Shock-Trauma — Not!

In Rites of Initiation I talked about how myth guru Joseph Campbell, in his The Power of Myth interviews with Bill Moyers, decried the fact that there are no longer any puberty rites to yank young people into mature adulthood.

In my own experience, gains in personal maturity come when there is a confluence of circumstances. First, there is something which precipitates trauma; one knows one is in trouble somehow, even if one has no idea what to do about it.

For me, trouble ensued upon the death of my mother in 1985.

The trouble can be mercifully brief, or it can last and last. If it lasts, part of the reason is often that we resist any and every imaginable strategy of relief.

In my case, the trouble lasted several years, during which time I was ill both physically and psychologically.

Next, there is typically a second precipitating event. This one somehow makes clear in our mind what general path we need to take out of the chaos of whatever trouble we have found ourselves in.

In my case, while I was sick in bed in 1990 I read for the third time a book by J.I. Packer called I Want To Be a Christian. The second precipitating event came when I was part-way through: for the first time, suddenly I was able to say of the Christian belief system, "I believe this!" Before, it had always been, "I just can't believe."

Often, there follows an information-gathering phase that will allow us to choose which particular strategy to use in following that general path.

For me, I bought and read Leo Rosten's Religions of America and C.S. Lewis's Mere Christianity, picked the Episcopal Church, and had a friend take me to a local parish, where I prepared for baptism and was duly baptized. (I have since become a Roman Catholic.)

My overarching point is, though, that there has to be some precipitating trauma — some sort of physical or emotional scarification that brings on personal chaos for a time that can be brief or protracted. Only then can (following a second precipitating event) there be healing. As we heal, we gain in maturity.


Our culture today (as Joseph Campbell pointed out) has gone to ever greater extremes to avoid doling out trauma to the young.

We find this on the secular front: our schools bend over backward to keep from bruising kids' egos and self-images. Corporal punishment is forbidden.

On the religious front, there are no frightening rites of passage such as existed in ancient and primitive societies. There are no ritual circumcisions of pubescent males. No scarifications. Not even any old-time baptisms where someone holds your head under water long enough for your life to flash in front of your eyes.

We allow no ritual doling out of trauma to our kids in part because there is such potential for abuse. In part, though, the problem is that to do so would mandate that the whole community agree on a belief system which justifies the (secular or religious) rite. But just the opposite happens. Parents say, "No one is going to traumatize my kids but me!"

Meanwhile, few parents spank. Few mete out harsh punishment. Few insist on adult-behavior-or-else.


Is it any wonder that so many young people today move back in with Mom and Dad after high school, or college, or grad school? Any wonder that they're waiting to get married until their late twenties, waiting to have children of their own until their early thirties? (Except, of course, for the recent upsurge in unmarried high-school girls getting pregnant in bunches for all the wrong reasons?)

We live in a culture that is averse to shock-trauma, and so sees ever-more-infantile behavior from supposedly adult human beings!

Monday, May 18, 2009

Rites of Initiation

Something human culture used to do well, but doesn't do well anymore, is the rite of initiation by which children, at about the time of puberty, are ushered into adulthood.

Mythology guru Joseph Campbell told Bill Moyers in The Power of Myth about the need for puberty rites (p. 8):
Moyers: Society has provided [young people] no rituals by which they become members of the tribe, of the community. All children need to be twice born, to learn to function rationally in the present world, leaving childhood behind. I think of that passage in the first book of Corinthians: "When I was a child, I spake as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things."

Campbell: That's exactly it. That's the significance of the puberty rites. In primal societies, there are teeth knocked out, there are scarifications, there are circumcisions, there are all kinds of things done. So you don't have your little baby body anymore, you're something else entirely.
(You can read more from the book in this MySpace blog entry; scroll down about halfway for this quote. Or, click here to search inside the book; enter "initiation" in the search field.)

Though I am not Jewish, I am, by Campbell's mention of circumcisions, put in mind of the Bar Mitzvah/Bat Mitzvah ceremony at which a 12-year-old girl or 13-year-old boy participates for the first time as an adult in the regular Sabbath service. In today's Judaism it has become a big deal, not unlike a combination of the First Holy Communion and Confirmation sacraments in the Catholic Church, but bigger still.

Jewish boys (but not girls) are ritually circumcised, not at puberty, but within a few days after birth.

Possibly (see this Wikipedia article) male circumcision originated as a rite of passage marking a boy's entrance into adulthood, though other intents have been cited as well. Campbell seems to favor the "rite of passage" interpretation. He also mentions "scarifications," the imposition of scars on the skin to signify the passage into adulthood. Modern civilization toned down the rites of passage such that by the time of Campbell's youth, roughly 100 years ago, the conferring of the right to wear long pants stood in for circumcisions and scarifications.

Today, in the hip hop generation, we find adult males wearing short pants in situations which would seem to demand trousers.

Campbell further tells Moyers (p. 82):
As a Catholic boy, you choose your confirmed name, the name you are going to be confirmed by. But instead of scarifying you and knocking your teeth out and all, the bishop gives you a smile and a slap on the cheek. It has been reduced to that. Nothing has happened to you. The Jewish counterpart is the bar mitzvah. Whether it actually works to effect a psychological transformation will depend on the individual case, I suppose. But in those old days there was no problem. The boy came out with a different body, and he had really gone through something.
Also (pp. 81-82):
... we know what the aborigines do in Australia. Now, when a boy gets to be a little bit ungovernable, one fine day the men come in, and they are naked except for stripes of white bird down that they've stuck on their own bodies using their own blood for glue. They are swinging the bull-roarers, which are the voices of spirits, and the men arrive as spirits.

The boy will try to take refuge with his mother, and she will pretend to try to protect him. But the men just take him away. A mother is no good from then on, you see. You can't go back to Mother, you're in another field.

Then the boys are taken out to the men's sacred ground, and they're really put through an ordeal — circumcision, subincision [a modification of the urethral opening of the penis], the drinking of men's blood, and so forth. Just as they had drunk mother's milk as children, now they drink men's blood. They're being turned into men. While this is going on, they are being shown enactments of mythological episodes from the great myths. They are instructed in the mythology of the tribe. Then, at the end of this, they are brought back to the village, and the girl whom each is to marry has already been selected. The boy has now come back as a man.

He has been removed from his childhood, and his body has been scarified, and circumcision and subincision have been enacted. Now he has a man's body. There's no chance of relapsing back to boyhood after a show like that.

Moyers: You don't go back to Mother.

Campbell: No, but in our life we don't have anything like that. You can have a man forty-five years old still trying to be obedient to his father. So he goes to a psychoanalyst, who does the job for him.
This gives us a profound understanding of why there is a "maturity gap" today.

Self-Policing Freedom

"Nils Lofgren, Taking Memos From the Boss" appears in today's Washington Post. It gives several quotes from that ace guitarist in Bruce Springsteen's E Street Band, Nils Lofgren. Asked, "There's some controversy surrounding the show here. TicketsNow, the Ticketmaster-owned reseller, sold some tickets to the show that it didn't have, leaving some fans in the lurch. Do you follow that sort of news at all?", Mr. Lofgren says:

Not too much. . . . I read what Bruce commented about it, how wrong it was, and I'm obviously in total agreement. Listen, that's why the whole planet's getting run down, because of greed and freedom gone unchecked. Sadly, there's people that aren't burdened with a conscience. Greed is king. We don't police those people well enough as a society, and that obviously has to change or things are just going to keep getting worse. This is just a microcosm of what's wrong with the planet. Look at Bernie Madoff.

America to me is the greatest country in the world and the greatest experiment in freedom we'll ever have. But our forefathers, I'm sure, expected us to police freedom appropriately. In other words, you shouldn't have freedom to pipe in porn to kids' computers . . . You shouldn't have freedom to murder and pillage and get out on good behavior in seven years. What the hell is that about? That's the great challenge of society, is to police freedom appropriately. It's not happening at the moment.

Policing freedom appropriately is indeed an important question. At first, "policing freedom" sounds like an oxymoron, a contradiction in terms. This blog maintains that the best way to resolve the tension between freedom and policing is self-policing.

In Rededicating This Blog I talked about words such as self-control, maturity, forbearance, and chastity. Self-policing is a good umbrella term for all of those.

To be an adult is to have the ability, ostensibly, to self-police ... which is why Mr. Lofgren bemoans the "freedom to pipe in porn to kids' computers." Adults have the right to look at porn on their computers. They also have the right to forbear looking at porn. In my earlier post, I gave "an abstaining from the enforcement of a right" as a definition of both forbearance and chastity. It could also be given as a definition of the umbrella term, self-policing.

Self-policing is impossible for the immature. For those with just a little bit of maturity, self-policing is a hit-or-miss proposition. For the truly mature, self-policing is a given.

Sunday, May 17, 2009

Rededicating This Blog

This blog is being renamed and rededicated! Once named "In Search of Solidarity," it is now called "In Search of Maturity." (It also has a new look.)

The change is fitting for several reasons. One is that this blogger will turn 62 this year. Quite obviously, if he doesn't gain some sort of maturity at this point in his life, he never will.

Second, I have, finally, in fact turned a corner on my own personal maturity. The maturity I am in search of is actually more that of others in the society and culture, particularly younger people.

Third, as a result of that indefinable something finally falling into place for me personally, in the last few months — as if a capstone has at last been dropped into a waiting arch — what has come into focus for me is the idea that our culture presently suffers from a massive "maturity gap." I'd like to do what I can to help close that gap.


To close the gap is hard, in part because defining the gap and demonstrating that it exists is hard. I would like to begin by citing Don't Eat the Marshmallow!, an earlier post to this blog. The thrust of that post was that an article recently in The New Yorker, "Don't!", furnishes evidence about something I consider to be a near-synonym of maturity, self-control.

Specifically, science has found that self-control is a thing that we can develop, but often fail to. We're not stuck with however much or little we are born with.

Scientists have experimented with children as young as four and found wide disparities in their ability to delay eating a marshmallow or Oreo cookie placed before them while their adult supervisor attends to an errand outside the room. The children are told to ring a bell if they can't wait any longer; some ring it quickly, while others hold out longer or never ring it at all. Those who put off eating the goodie until the adult returns get the promised reward of an extra goodie.

And some children not only don't wait, they don't even bother to ring the bell. They just snatch the goodie and consume it. Some even find a way to (seemingly) fool the adult: lick the cream filling out of an Oreo and reassemble it to escape detection.

Experimenters have looked into the later lives of the impatient "low delayers" (these experiments were originally conducted in the late 1960s) and found that some have become "high delayers," and that these lucky individuals have had more successful lives than the perennial snatchers and grabbers who never learned to postpone gratification.


I would argue that the ability to postpone gratification is the essence of maturity, by whatever name you wish to call it: self-control, self-discipline, self-denial, etc.

I would further suggest that another name for the same phenomenon is "chastity." Here is a word, I know, that immediately rings bells and sounds loud buzzers in the culture, for the first thing it brings to mind is sexual forbearance. Isn't that type of forbearance something the culture gratefully laid to rest in the 1960s? Isn't it the general (and proper) understanding today that sex by and between consenting adults is perfectly fine, whatever the circumstances of those adults?

My response:

First of all, chastity, properly understood, is more than sexual forbearance. It is (as is forbearance per se) a word that means "a refraining from something; patient endurance; self-control; an abstaining from the enforcement of a right." To be chaste is to be pure and virtuous, stainless and undefiled. Obviously, abstaining from the wrong sorts of sex is a big part of chastity, but it's only the tip of the iceberg. There are plenty of ways for someone who is totally celibate, sexually speaking, to offend against chastity.

Second, I may as well say this right out loud, right now, rather than tiptoe around the issue: I no longer believe in the Sexual Revolution.


In fact, I have this confession to make: the recent arrival of the capstone in my arch of maturity coincided with my giving up masturbating, a practice which the Sexual Revolution said was as natural and healthful as eating and breathing.

As I indicated in God of Chastity, Part II and earlier posts in my Theology of the Body series about an important theological outlook espoused by the late Pope John Paul II, my church preaches chastity. This past Lenten season, I began practicing it by stopping masturbating.

Specifically, I gave up looking at porn on the Internet and doing what comes naturally when one looks at porn ... a thing one also tends to do many times, as well, when not looking at porn on the Internet.

More specifically yet, the kind of dirty pictures I favored when looking at online porn were not "normal," but oriented toward a particular fetish or perversion. I'm not going to say what it was ... it's a preoccupation that a lot of men have, and not a few women as well, in which something bodily that is not intended to provoke lust does anyway. I now realize that entertaining such preoccupations, though no one is really harmed by it, is nothing if not immature.

And this is now a blog against immaturity!


I bring these personal things up because I believe a lot of the sexual behavior that goes on today without any stigma of illicitness is just the acting out of immaturity and the inability to postpone gratification. It has been so since very early in my life — I was born in 1947 — but the pace picked up with the Sexual Revolution of the 1960s. And, as the young people of my Baby Boomer generation threw off the old shackles of sexual repression, they also gave evidence of being notably less mature in other ways.

I witnessed this in my own young life in the form of the cataclysmic violence that erupted in the late 1960s as college and university students tried to "tear down the walls" and summarily enact a political revolution.

I was a junior at Georgetown University in 1968 when opposition to the war in Vietnam turned ugly. Students who were "clean for Gene" early in the year — for Senator Eugene McCarthy, who ran as an antiwar candidate against incumbent President Lyndon Johnson — watched as the hope engendered by his positive early results in the New Hampshire primary and elsewhere was doused by the killings of Martin Luther King and Bobby Kennedy. King was the crucial link between the civil rights movement and establishment politics, and Robert Francis Kennedy was the man most likely to translate McCarthy's quixotic crusade into a successful peace Presidency.

With their snuffing out, leaders of the "movement" for civil rights and peace descended on Chicago during the week of the Democratic National Convention bent on goading the police into starting a riot — which is what actually happened. Young self-styled radicals defended what they had wrought in terms of no longer being willing to wait for peace and justice.

No longer being willing to wait ... that, par excellence, is a recipe for the victory in one's soul of the precipitous and the immature.

Did their coming to prevail have anything to do with the Sexual Revolution? The question answers itself when you consider that the old order of sexual propriety was based on the notion that sex before marriage is a sin. When sex before marriage is a sin, sex outside of the duly sanctified bridal chamber is out of the question entirely. Throwing such notions aside, I believe, was manifestly a recipe for the onset of radical immaturity in other avenues of life.

Believe me, I wish it were otherwise. I wish we could have our marshmallow and eat it, too: that we could be faultlessly mature in other areas of life while not reining ourselves in sexually in the least degree. But my experience is that we are not built that way, and the world does not work that way.

I don't plan to say all that much about sex in my rededicated "In Search of Maturity" blog, though. I know of no better way to turn what I would hope to be a reasonable discussion into a food fight than to become a scold about the kind of behavior I know people want to hear nothing negative about.

Besides, there are any number of topics related to maturity and self-control that don't ask people to give up such a mainstay of their present lifestyle as "illicit sexuality" ... until they're ready to do so, that is. Lord knows, that was literally the last thing I myself wanted to give up!

* * *


Before I shut up, a few words about the quotation at the head of my blog. "Use the Force, Luke!" are the words that echoed in the thoughts of the young Jedi-in-training Luke Skywalker as he led the rebels' attack against Darth Vader's Death Star in the original Star Wars movie in 1977. Luke had learned them from his mentor, old Ben Kenobi, while being trained in the ways of the Force.

Using the Force is, I'd say, a metaphor for the wisdom taught by all our religions, a wisdom that comes down to the values of self-control, discipline, and chastity in the broad sense in which that word is used above. Luke had, or developed, such virtues as a Jedi knight-in-training, in contrast to the gruff, dissolute lifestyle of Han Solo. Han was by no means evil like Darth Vader; he was good, but as a cynical hothead he was clearly no candidate for Jedi knighthood.

The character of Obi-wan Kenobi made manifest such wisdom and maturity as young Luke was in need of en route to his knighthood. Often, those in the presence of Ben Kenobi would automatically do as the Force would have them do, no matter how much they were customarily in thrall to the Dark Side. That's why I hereby make "Use the Force, Luke!" the epigraph to this blog.

Friday, May 15, 2009

Kathleen Parker on Francis Collins

Washington Post columnist Kathleen Parker's piece about Francis Collins, A Physician-Geneticist Seeks to Foster Both Faith and Science, is a must-read for anyone interested in the question of whether one can be a Christian and believe in evolution, too.

Francis Collins headed the Human Genome Project, a big part of the scientific effort that cataloged our species' genes, so he is a scientist par excellence. He is a believer in the theory of evolution. A noteworthy Christian as well, he does not see any conflict between the two.

Everybody knows that many Christians disagree on this. Of those who think evolution is anti-Christian, Parker writes, "Their objections haven't changed much since Billy Sunday first articulated them almost 100 years ago and revolve around the fear that acceptance of evolution negates God."

That, then, is the heart of the matter. If Christians could come to see that evolution does not — repeat, does not — negate God, the world would be a better place.

"To that end," writes Parker, "[Collins] created the BioLogos Foundation and last month launched a Web site — BioLogos.org — to advance an alternative to the extreme views that tend to dominate the debate ... Through the foundation and Web site, Collins is hoping to help home-schoolers and other Christian educators come to grips with their scientific doubts. Among other projects, he intends to develop curricula that combine faith and science."

To which I say, more power to him!


We Christians need to embrace evolution, I would say, because for some of us to hold out against it not only splits the church, it splits society at large along fault lines that don't really need to be there. It unnecessarily puts off people who are not yet believers. It makes all religion seem, to some, foolish and dispensable.

Plus, Parker cites Collins as arguing that:
The problem of not believing in evolution as one might not believe in, say, goblins or flying pigs has repercussions beyond the obvious — that the United States will continue to fall behind other nations in science education. Collins says that many creationist-trained young people suffer an intense identity crisis when they leave home for college, only to discover that the Earth is about 4.5 billion years old. Talk about messing with your mind.

Collins says he hears from dozens of young people so afflicted. Most susceptible to crisis are children who have been home-schooled [as was Collins himself, until the sixth grade] or who have attended Christian schools.


I am well aware how hard it is to get evolution opponents to, so to speak, see the light. Darwin's theory (as brought up to date over the century-and-a-half since he published On the Origin of Species in 1859) is hard to grasp. It demands belief in things that seem wholly counterintuitive, like the ability of hereditary material — genes, DNA — to stay basically the same from one generation to the next, and on to the next, except of course when it does change.

That is what "mutations" are: changes in genetic material, and they happen a lot. It's just that, since there are so many genes per organism, "a lot" turns out to be an exceedingly tiny percentage of the total number of genes passed to offspring from their parents. Moreover, most mutations either make no tangible difference to survival chances or, if they do make for palpable change, can get snuffed out because they leave the offspring that bear them less well adapted to the environment.

Such "survival of the fittest" is key to evolution. In a dangerous world, more offspring are produced than can possibly make it to adulthood and produce yet more offspring. Which ones produce their own posterity and which ones don't is often determined by which ones have genetically derived characteristics that fit them best to the circumstances under which they find themselves living.

Darwin's name for this reality was "natural selection." Evolutionists emphasize that it is purposeless and blind. Even so, along with the mutations that occur at random in genetic material and the "superfecundity" which ensures that there are way more offspring born than can possibly survive, it is the guarantor of evolutionary change.
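The interplay of these three ingredients — random mutation, superfecundity, and blind selection — is easy to watch in a toy simulation. The sketch below is purely illustrative, my own invention rather than anything from Darwin or the biologists; the function name, parameters, and the simplistic "count the 1-bits" fitness measure are all made up for the demonstration:

```python
import random

def evolve(generations=100, pop_size=50, genome_len=20,
           mutation_rate=0.01, seed=42):
    """Toy model of natural selection.

    A genome is a list of bits; fitness is simply the number of
    1-bits. Each generation, parents are chosen in proportion to
    fitness ("survival of the fittest"), and each offspring copies
    a parent's genome with rare random bit-flips (mutations).
    """
    rng = random.Random(seed)
    # Start with a completely random population.
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [sum(g) for g in pop]
        # Fitness-proportional ("roulette wheel") selection: fitter
        # genomes leave more descendants, less fit ones fewer.
        parents = rng.choices(pop, weights=fitness, k=pop_size)
        # Copy each parent, flipping each bit with small probability.
        pop = [[bit ^ (rng.random() < mutation_rate) for bit in g]
               for g in parents]
    return pop

final = evolve()
mean_fitness = sum(sum(g) for g in final) / len(final)
print(mean_fitness)
```

Run it, and the population's average fitness climbs steadily from around half the maximum toward the maximum itself, even though no individual mutation is aimed anywhere: purposeless variation plus differential reproduction turns out to be enough.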

If one is an evolutionist, one also has to believe that changes in the stuff of heredity, when they are not vetoed by natural selection, take immensely long stretches of time (hundreds of thousands, if not millions, of years) to make major changes in what their host organisms look like or how they function, very occasionally leading over eons and eons to the actual emergence of a brand-new species.

Still, our planet's fossil record seems to show — "fossils" are remains of living organisms that have been preserved somehow — that there are bursts of "speciation" in which large numbers of new species arrive all at once. These short bursts punctuate much longer periods of stasis and equilibrium. There are also occasional "mass extinctions" in which large numbers of species rapidly perish.


To get a new species to emerge, nature first has to find a way to make sure two populations of a single ancestor species get "reproductively isolated" from each other, perhaps owing to the interposition of a geographic barrier. One branch will then start adapting to a changed environment and, typically millions of years later, Bob's your uncle (as the British say). If the barrier is then removed, the two formerly isolated groups can no longer interbreed; they are separate species.

Geologically, the Earth is something like 4.6 billion years old, and the first organisms, single-celled bacteria or the like, appeared at least 3.5 billion years ago. To our minds, things remained pretty dull until the seemingly rapid appearance of most of today's major groups of complex organisms around 530 million years ago, in the Cambrian explosion.

Until about 65 million years ago, dinosaurs roamed the Earth. When they went extinct, little organisms who fed their young on milk, the early mammals, began to have a field day. For the first time, the world was their oyster.

By some five to seven million years ago, our lineage and that of our closest living cousins, the chimpanzees, had diverged from a common ancestor in Africa. The first species of our own genus, Homo, arose a few million years later. (See Human evolution in Wikipedia.) Our own particular species, Homo sapiens, is all that is left of that genus. Its earliest known examples, called "archaic Homo sapiens," arose less than one half million years ago. It's hard to be absolutely sure these early humans were precisely of our species, though most scientists say they were.

Roughly 250,000 years ago, in a time between two ice ages, Homo sapiens proper appeared. By 160,000 years ago, the first "anatomically modern" version of our species had arrived.


Several thousand years ago, certain humans began worshiping a single God, Yahweh. The earliest known copies of portions of the Torah date from about the 7th century B.C.E., roughly 2,700 years ago. From it and its Abrahamic tradition descend the Judaism, Christianity, and Islam of today.

The Torah — the most holy scripture of Judaism, revered by Christians as the first five books of their Old Testament — begins with the book of Genesis, whose first chapter describes how the Lord, the One God, in his power and might created the world and everything in it, including all living kinds and mankind itself.

Today's creationists insist that this creation story must be read literally, for it must be literally true or true not at all. If it is literally true, the theory of evolution cannot be right.

I say the creationists are dead wrong in their belief. Genesis, chapter one, was written (or written down, out of earlier oral tradition) by and for a pre-scientific people who could not have understood today's scientific evolution theory. The point of the story was, and remains, the establishment of the fact that the Lord God is a unique deity of incomprehensible power and might.

God is supernatural, and how He creates the world is beyond our categories of thought. Yet stories must be told. But why would we who believe in Him want to cage His unimaginable power and might to create the world as He sees fit within any particular story and its entirely human categories of thought?

Francis Collins, according to Kathleen Parker, puts it this way: "To Collins, Darwin is a threat only if one thinks that God is an underachiever. Collins doesn't happen to believe that. His study of genes has led him to conclude that God is both outside of nature and outside of time. He's big, in other words. The idea that God would create the mechanism of evolution makes sense."

Wednesday, May 13, 2009

Don't Eat the Marshmallow!

An interesting article in The New Yorker of May 18, 2009: "Don't!" by Jonah Lehrer.

The article is about the scientific investigation of self-control. Decades ago, a psychologist named Walter Mischel experimented to see whether there were meaningful differences among four-year-olds' abilities to postpone gratification. He left them alone in a room with instructions not to eat a marshmallow until he returned from an errand outside the room, at which time they could have two marshmallows, not just one. But if they couldn't wait, they could ring a bell, and he would come back in right away and give them just a single marshmallow.

Some of the kids could delay gratification for as long as fifteen minutes, the outer limit imposed by Mischel; others held out for three minutes or just 30 seconds before the bell got rung, and some stuffed a marshmallow into their mouth right away without even ringing the bell. One lad, tested with Oreo cookies rather than marshmallows, opened a cookie and licked out the cream filling before slyly re-assembling the cookie and returning it to the tray.

In short, some of the kids were "high delayers" at the age of four, while others were "low delayers." One might assume the large differences at such a young age to be genetic, but Mischel believes there is a dance between nature (genes) and nurture (upbringing; environmental factors) such that either/or questions about those two poles of behavior causation are meaningless.


My interest in this has to do with my belief that postponing gratification and the practice of self-control are crucial to traditional religious standards of morality.

We Christians pray, "Lead us not into temptation but deliver us from evil." For a four-year-old, temptation is being left alone in a room with a tray of marshmallows they aren't permitted to touch, and evil is stealthily consuming the filling of an Oreo cookie without letting on what one has done.

Whether it's waiting until Christmas morning to open presents or waiting until marriage to have sex, postponing gratification has always been part and parcel of the Christian experience.

Speaking just for myself as a baby boomer, my personal experience has been one of becoming a consistent "high delayer" only later in life. This seems to have been a result of becoming, in midlife, a Christian in more than name only.


Before I was 40, I was not religious. And in many ways I'd say I was a low delayer, though never one who would grab the marshmallow without ringing the bell. Accordingly, I think Mischel and the other researchers mentioned in the article ought to make a careful distinction between low delayers who nonetheless play by the rules, and those who secretly filch Oreo fillings and look innocent when the adult returns.

Another thing: in my experience there seem to be two classes of evildoing. I was never tempted to become a bully. I have always been appalled at snobbery and intolerance. I have had relatively little trouble getting on board with feminism, inclusiveness, and multiculturalism.

Consuming illicit marshmallows (or other banned substances) was more my style. Being in a big hurry about obtaining gratification was something else again ... something that waxed and waned over the course of my formative years, exactly as Mischel's research shows it might. Mischel believes that whether one acts as a high or a low delayer is determined at least in part by the specific context in which the behavior takes place.

Low delayers can use various tricks to make themselves high delayers in a specific context, says Mischel. For example, he successfully trained kids faced with the task of not eating a marshmallow to imagine it as a picture in a frame, or as a fluffy cloud. He found that kids who were already high delayers had their own tricks, such as occupying their minds with other pursuits while they waited.


Mischel and other experimenters have followed up his original research, interested in, among other things, how the four-year-olds of the late 1960s fared in adulthood. In general, the high-delay children have done better in several areas of life. This result has been confirmed by other research. For instance, Angela Lee Duckworth has found that "the ability to delay gratification — eighth graders were given a choice between a dollar right away or two dollars the following week — was a far better predictor of academic performance than I.Q."

Of the original "marshmallow subjects," the article says, some "failed the marshmallow task as four-year-olds but ended up becoming high-delaying adults. 'This is the group I'm most interested in,' [Mischel] says. 'They have substantially improved their lives.'" For this to happen, ad hoc mental tricks are not enough; "the real challenge is turning those tricks into habits, and that requires years of diligent practice."

It is my belief that religion does precisely that. It starts out giving us tricks to delay gratification in specific circumstances and thus gradually turns waiting and postponing into a way of life.

For instance, Christians have a way of imagining that Jesus walks beside them at all times. Of course, the answer to "What would Jesus do?" is never to steal the filling from an Oreo cookie.

Jesus's counsel would ever be, "Don't eat the marshmallow!" At some point we learn not to partake too soon without always needing to have His voice in our ear, and we have the basis to begin living an "It's not all about you" life.